- Essentially all the news pertaining to US federal funding of science continues to be awful. This article from Science summarizes the situation well, as does this from The Guardian and this editorial in the Washington Post. I do like the idea of a science fair of cancelled grants as a way to get the allegedly bipartisan appropriators to notice just how bad the consequences of the proposed cuts would be.
- On a more uplifting note, mathematicians have empirically demonstrated a conjecture originally made by John Conway: that it is possible to make a tetrahedral pyramid that, under gravity, has only one stable orientation. Quanta has a nice piece on this with a cool animated gif, and here is the actual preprint about it. It's all about mass distributions and moments of inertia about edges. As others (including the authors) have pointed out, this could be quite useful for situations like recent lunar lander attempts, which seem to have a difficult time not tipping over.
- A paper last week in Nature uses photons and a microcavity to try to test how long it takes photons to tunnel through a classically forbidden region. In this setup, it is mathematically legit to model the photons as if they have an effective mass, and one can model the barrier they need to traverse in terms of an effective potential energy. Classically, if the kinetic energy of the particle of interest is less than the potential energy of the barrier, the particle is forbidden inside the barrier. I've posted about the issue of tunneling time repeatedly over the years (see here for a 2020 post containing links), because I think it's a fascinating problem both conceptually and as a puzzle for experimentalists (how does one truly do a fair test of this?). The take-away from this paper is that the more classically forbidden the motion, the shorter the deduced tunneling time. This has been seen in other experiments testing this idea. A key element of novelty in the new paper is the authors' claim that the present experiment cannot reasonably be modeled by Bohmian mechanics. I'd need to read this in more depth to understand it better, as I had thought that Bohmian mechanics applied to problems like this is, basically by design, generally indistinguishable in its predictions from conventional quantum mechanics.
- In other non-condensed matter news, there is an interstellar comet transiting the solar system right now. This is very cool - it's only the third such object detected by humans, but to be fair we've only really been looking for a few years. This suggests that moderately sized hunks of material are likely passing through from interstellar space all the time, and the Vera C. Rubin Observatory will detect a boatload of them. My inner science fiction fan is hoping that the object changes its orbit at perihelion by mysterious means.
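The stability criterion behind the tetrahedron item above (a resting face is stable only if the center of mass projects orthogonally into that face) can be checked in a few lines. This is my own toy sketch, not the construction from the preprint: it assumes uniform density, so the center of mass sits at the centroid, and it confirms that a uniform regular tetrahedron is stable on all four faces. That is exactly why monostability requires a carefully engineered, nonuniform mass distribution.

```python
import itertools
import numpy as np

# Regular tetrahedron; with the assumed uniform density, the CoM is the centroid.
V = np.array([[1, 1, 1], [1, -1, -1], [-1, 1, -1], [-1, -1, 1]], dtype=float)
com = V.mean(axis=0)

def face_is_stable(a, b, c, com):
    """Resting on face (a, b, c) is stable iff the CoM projects into the triangle."""
    n = np.cross(b - a, c - a)
    n = n / np.linalg.norm(n)
    p = com - np.dot(com - a, n) * n          # foot of the perpendicular from the CoM
    # Barycentric coordinates of p in triangle (a, b, c):
    u, v, w = b - a, c - a, p - a
    uu, uv, vv, wu, wv = u @ u, u @ v, v @ v, w @ u, w @ v
    denom = uu * vv - uv * uv
    s = (vv * wu - uv * wv) / denom
    t = (uu * wv - uv * wu) / denom
    return s >= 0 and t >= 0 and s + t <= 1

stable_faces = sum(face_is_stable(*f, com) for f in itertools.combinations(V, 3))
print(stable_faces)  # 4: a uniform tetrahedron is stable on every face
```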
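To illustrate the tunneling-time trend mentioned above, here is a toy one-dimensional calculation (my own sketch in natural units with hbar = m = 1, not the photonic setup of the paper): the Wigner phase-delay time for a particle tunneling through a rectangular barrier, computed from the energy derivative of the transmission phase. The deduced time shrinks as the barrier height grows, consistent with the well-known Hartman-effect behavior.

```python
import numpy as np

def phase_time(E, V0, L, dE=1e-6):
    """Wigner phase-delay time for tunneling through a rectangular barrier
    of height V0 and width L, in natural units (hbar = m = 1)."""
    def trans_phase(E):
        k = np.sqrt(2.0 * E)               # wavevector outside the barrier
        kappa = np.sqrt(2.0 * (V0 - E))    # decay constant inside (E < V0)
        # Transmission amplitude with the free-propagation phase factored out:
        t = 1.0 / (np.cosh(kappa * L)
                   + 1j * (kappa**2 - k**2) / (2 * k * kappa) * np.sinh(kappa * L))
        return np.angle(t)
    # Numerical derivative of the transmission phase with respect to energy:
    return (trans_phase(E + dE) - trans_phase(E - dE)) / (2 * dE)

E, L = 0.5, 2.0
times = [phase_time(E, V0, L) for V0 in (1.0, 2.0, 5.0, 10.0)]
print(times)  # monotonically decreasing: higher barrier, shorter deduced time
```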
A blog about condensed matter and nanoscale physics. Why should high energy and astro folks have all the fun?
Tuesday, July 08, 2025
New updates + tetrahedra, tunneling times, and more
Monday, June 30, 2025
Science slow down - not a simple question
I participated in a program about 15 years ago that looked at science and technology challenges faced by a subset of the US government. I came away thinking that such problems fall into three broad categories.
- Actual science and engineering challenges, which require foundational research and creativity to solve.
- Technology that may be fervently desired but is incompatible with the laws of nature, economic reality, or both.
- Alleged science and engineering problems that are really human/sociology issues.
Part of science and engineering education and training is giving people the skills to recognize which problems belong to which categories. Confusing these can strongly shape the perception of whether science and engineering research is making progress.
There has been a lot of discussion in the last few years about whether scientific progress (however that is measured) has slowed down or stagnated. For example, see here:
https://www.theatlantic.com/science/archive/2018/11/diminishing-returns-science/575665/
https://news.uchicago.edu/scientific-progress-slowing-james-evans
https://theweek.com/science/world-losing-scientific-innovation-research
A lot of the recent talk is prompted by this 2023 study, which argues that despite the world having many more researchers than ever before (behold population growth) and more global investment in research, somehow "disruptive" innovations are fewer and farther between these days. (Whether this is an accurate assessment is not a simple matter to resolve; more on this below.)
There is a whole tech bro culture that buys into this, however. For example, see this interview from last week in the New York Times with Peter Thiel, which points out that Thiel has been complaining about this for a decade and a half.
On some level, I get it emotionally. The unbounded future spun in a lot of science fiction seems very far away. Where is my flying car? Where is my jet pack? Where is my moon base? Where are my fusion power plants, my antigravity machine, my tractor beams, my faster-than-light drive? Why does the world today somehow not seem that different than the world of 1985, while the world of 1985 seems very different than that of 1945?
Some of the folks that buy into this think that science is deeply broken somehow - that we've screwed something up, because we are not getting the future they think we were "promised". Some of these people have this as an internal justification underpinning the dismantling of the NSF, the NIH, basically a huge swath of the research ecosystem in the US. These same people would likely say that I am part of the problem, and that I can't be objective about this because the whole research ecosystem as it currently exists is a groupthink self-reinforcing spiral of mediocrity.
Science and engineering are inherently human ventures, and I think a lot of these concerns have an emotional component. My take at the moment is this:
- Genuinely transformational breakthroughs are rare. They often require a combination of novel insights, previously unavailable technological capabilities, and luck. They don't come on a schedule.
- There is no hard and fast rule that guarantees continuous exponential technological progress. Indeed, in real life, exponential growth regimes never last. The 19th and 20th centuries were special. If we think of research as a quest for understanding, it's inherently hierarchical. Civilizational collapses aside, you can only discover how electricity works once. You can only discover the germ theory of disease, the nature of the immune system, and vaccination once (though in the US we appear to be trying really hard to test that by forgetting everything). You can only discover quantum mechanics once, and doing so doesn't imply that there will be an ongoing (infinite?) chain of discoveries of similar magnitude.
- People are bad at accurately perceiving rare events and their consequences, just like people have a serious problem evaluating risk or telling the difference between correlation and causation. We can't always recognize breakthroughs when they happen. Sure, I don't have a flying car. I do have a device in my pocket that weighs only a few ounces, gives me near-instantaneous access to the sum total of human knowledge, lets me video call people around the world, can monitor aspects of my fitness, and makes it possible for me to watch sweet videos about dogs. The argument that we don't have transformative, enormously disruptive breakthroughs as often as we used to or as often as we "should" is in my view based quite a bit on perception.
- Personally, I think we still have a lot more to learn about the natural world. AI tools will undoubtedly be helpful in making progress in many areas, but I think it is definitely premature to argue that the vast majority of future advances will come from artificial superintelligences and thus we can go ahead and abandon the strategies that got us the remarkable achievements of the last few decades.
- I think some of the loudest complainers (Thiel, for example) about perceived slowing advancement are software people. People who come from the software development world don't always appreciate that physical infrastructure and understanding are hard, and that there are not always clever or even brute-force ways to get to an end goal. Solving foundational problems in molecular biology or quantum information hardware or photonics or materials is not the same as software development. (The tech folks generally know this on an intellectual level, but I don't think all of them really understand it in their guts. That's why so many of them seem to ignore real-world physical constraints when talking about AI.) Trying to apply software-development-inspired approaches to science and engineering research isn't bad as a component of a many-pronged strategy, but alone it may not give the desired results - as warned in part by this piece in Science this week.
More frequent breakthroughs in our understanding and capabilities would be wonderful. I don't think dynamiting the US research ecosystem is the way to get us there, and hoping that we can dismantle everything because AI will somehow herald a new golden age seems premature at best.
Saturday, June 28, 2025
Cryogenic CMOS - a key need for solid state quantum information processing
The basis for much of modern electronics is a set of silicon technologies called CMOS, which stands for complementary metal oxide semiconductor devices and processes. "Complementary" means using a semiconductor (typically silicon) that is locally chemically doped so that you can have both n-type (carriers are negatively charged electrons in the conduction band) and p-type (carriers are positively charged holes in the valence band) material on the same substrate. With field-effect transistors (using oxide gate dielectrics), you can make very compact, comparatively low power devices like inverters and logic gates.
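As a minimal illustration of what "complementary" buys you, here is a switch-level toy model of my own (idealized transistors as on/off switches, not a circuit simulator) of an inverter and a NAND gate. The assertion encodes the key point: the pull-up and pull-down networks are never both conducting, so static CMOS has no steady current path from VDD to GND, which is the source of its comparatively low power.

```python
# Idealized switch-level model, purely for illustration:
def nmos(gate):   # n-channel FET conducts when its gate is high
    return bool(gate)

def pmos(gate):   # p-channel FET conducts when its gate is low
    return not gate

def inverter(a):
    pull_up = pmos(a)        # pFET connects output to VDD
    pull_down = nmos(a)      # nFET connects output to GND
    # Complementary operation: exactly one network conducts, so there is
    # no static current path from VDD to GND.
    assert pull_up != pull_down
    return pull_up

def nand(a, b):
    pull_up = pmos(a) or pmos(b)       # pFETs in parallel to VDD
    pull_down = nmos(a) and nmos(b)    # nFETs in series to GND
    assert pull_up != pull_down
    return pull_up

print([inverter(x) for x in (0, 1)])                  # [True, False]
print([nand(a, b) for a in (0, 1) for b in (0, 1)])   # [True, True, True, False]
```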
There are multiple different approaches to try to implement quantum information processing in solid state platforms, with the idea that the scaling lessons of microelectronics (in terms of device density and reliability) can be applied. I think that essentially all of these avenues require cryogenic operating conditions; all superconducting qubits need ultracold conditions for both superconductivity and to minimize extraneous quasiparticles and other decoherence sources. Semiconductor-based quantum dots (Intel's favorite) similarly need thermal perturbations and decoherence to be minimized. The wealth of solid state quantum computing research is the driver for the historically enormous (to me, anyway) growth of dilution refrigerator manufacturing (see my last point here).
So you eventually want to have thousands of error-corrected logical qubits at sub-Kelvin temperatures, which may involve millions of physical qubits at sub-Kelvin temperatures, all of which need to be controlled. Despite the absolute experimental fearlessness of people like John Martinis, you are not going to get this to work by running a million wires from room temperature into your dil fridge.
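A back-of-the-envelope comparison shows why putting control logic at the cold stage matters; all numbers here are assumed purely for illustration, not anyone's actual architecture.

```python
import math

n_physical = 1_000_000     # order-of-magnitude physical qubit count
lines_per_qubit = 2        # assume one control and one readout line each

# Brute force: run every line from room temperature down into the fridge.
direct_wires = n_physical * lines_per_qubit

# With logic at the cold stage, qubits can instead be addressed
# crossbar-style, needing roughly sqrt(N) rows plus sqrt(N) columns:
crossbar_lines = 2 * math.isqrt(n_physical)

print(direct_wires, crossbar_lines)  # 2000000 vs 2000
```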
Fig. 1 from here.
In this most recent work, the aspect that I find most impressive is that the CMOS electronics are essentially a serious logic-based control board operating at milliKelvin temperatures right next to the chip with the qubits (in this case, spins in quantum dots). I'm rather blown away that this works, and with sufficiently low power dissipation that the fridge is happy. This is very impressive, and there is likely a very serious future in store for cryogenic CMOS.
Saturday, June 21, 2025
Brief items - fresh perspectives, some news bits
- First, it's a pleasure to see new long-form writing about condensed matter subjects, in an era where science blogging has unquestionably shrunk compared to its heyday. The new Quantum Matters substack by Justin Wilson (and William Shelton) looks like it will be a fun place to visit often.
- Similar in spirit, I've also just learned about the Knowmads podcast (here on youtube), put out by Prachi Garella and Bhavay Tyagi, two doctoral students at the University of Houston. Fun interviews with interesting scientists about their science and how they get it done.
- There have been some additional news bits relevant to the present research funding/university-government relations mess. Earlier this week, 200 business leaders published an open letter about how slashing support for university research will seriously harm US economic competitiveness. More of this, please. I continue to be surprised by how quiet technology-related, pharma, and finance companies are being, at least in public. Crushing US science and engineering university research will lead to serious personnel and IP shortages down the line, definitely poor for US standing. Again, now is the time to push back on legislators about cuts mooted in the presidential budget request.
- The would-be 15% indirect cost rate at NSF has been found to be illegal, in a summary court judgment released yesterday. (Brief article here, pdf of the ruling here.)
- Along these lines, there are continued efforts for proposals about how to reform/alter indirect cost rates in a far less draconian manner. These are backed by collective organizations like the AAU and COGR. If you're interested in this, please go here, read the ideas, and give some feedback. (Note for future reference: the Joint Associations Group (JAG) may want to re-think their acronym. In local slang where I grew up, the word "jag" does not have pleasant connotations.)
- The punitive attempt to prevent Harvard from taking international students has also been stopped for now in the courts.
Sunday, June 15, 2025
So you want to build a science/engineering laboratory building
- The NSF awarded 500 more graduate fellowships this week, bringing the total for this year up to 1500. (Apologies for the X link.) This is still 25% lower than last year's number, and of course far below the original CHIPS and Science Act target of 3000, but it's better than the alternative. I think we can now all agree that the supposed large-scale bipartisan support for the CHIPS and Science Act was illusory.
- There seem to be some initial signs of pushback on the Senate side regarding the proposed massive science funding cuts. Again, now is the time to make views known to legislators - I am told by multiple people with experience in this arena that it really can matter.
- There was a statement earlier this week that apparently the US won't be going after Chinese student visas. This would carry more weight if it didn't look like US leadership was wandering ergodically through all possible things to say with no actual plan or memory.
- Any big laboratory building should have a dedicated loading dock with central receiving. If you're spending $100M-200M on a building, this is not something that you should "value engineer" away. The long term goal is a building that operates well for the PIs and is easy to maintain, and you're going to need to be able to bring in big crates for lab and service equipment. You should have a freight elevator adjacent to the dock.
- You should also think hard about what kind of equipment will have to be moved in and out of the building when designing hallways, floor layouts, and door widths. You don't want to have to take out walls, doorframes, or windows, or to need a crane to hoist equipment into upper floors because it can't get around corners.
- Think hard about process gases and storage tanks at the beginning. Will PIs need to have gas cylinders and liquid nitrogen and argon tanks brought in and out in high volumes all the time, with all the attendant safety concerns? Would you be better off getting LN2 or LAr tanks even though campus architects will say they are unsightly?
- Likewise, consider whether you should have building-wide service for "lab vacuum", N2 gas, compressed air, DI water, etc. If not and PIs have those needs, you should plan ahead to deal with this.
- Gas cylinder and chemical storage - do you have enough on-site storage space for empty cylinders and back-up supply cylinders? If this is a very chemistry-heavy building, think hard about safety and storing solvents.
- Make sure you design for adequate exhaust capacity for fume hoods. Someone will always want to add more hoods. While all things are possible with huge expenditures, it's better to make sure you have capacity to spare, because adding hoods beyond the initial capacity would likely require a huge redo of the building HVAC systems.
- Speaking of HVAC, think really hard about controls and monitoring. Are you going to have labs that need tight requirements on temperature and humidity? When you set these up, put enough sensors of the right types in the right places, and make sure that your system is designed to work even when the outside air conditions are at their seasonal extremes (hot and humid in the summer, cold and dry in the winter). Also, consider having a vestibule (air lock) for the main building entrance - you'd rather not scoop a bunch of hot, humid air (or freezing, super-dry air) into the building every time a student opens the door.
- Still on HVAC, make sure that power outages and restarts don't lead to weird situations like having the whole building at negative pressure relative to the outside, or duct work bulging or collapsing.
- Still on HVAC, actually think about where the condensate drains for the fan units will overflow if they get plugged up or overwhelmed. You really don't want water spilling all over a rack of networking equipment in an IT closet. Trust me.
- Chilled water: Whether it's the process chilled water for the air conditioning, or the secondary chilled water for lab equipment, make sure that the loop is built correctly. Incompatible metals (e.g., some genius throws in a cast iron fitting somewhere, or joints between dissimilar metals) can lead to years and years of problems down the line. Make sure lines are flushed and monitored for cleanliness, and have filters in each lab that can be checked and maintained easily.
- Electrical - design with future needs in mind. If possible, it's a good idea to have PI labs with their own isolation transformers, to try to mitigate inter-lab electrical noise issues. Make sure your electrical contractors understand the idea of having "clean" vs. "dirty" power and can set up the grounding accordingly while still complying with code.
- Still on electrical, consider building-wide surge protection, and think about emergency power capacity. For those who don't know, emergency power is usually a motor-generator that kicks in after a few seconds to make sure that emergency lighting and critical systems (including lab exhaust) keep going.
- Ceiling heights, duct work, etc. - It's not unusual for some PIs to have tall pieces of equipment. Think about how you will accommodate these. Pits in the floors of basement labs? 5 meter slab-to-slab spacing? Think also about how ductwork and conduits are routed. You don't want someone to tell you that installation of a new apparatus is going to cost a bonus $100K because shifting a duct sideways by half a meter will require a complete HVAC redesign.
- Think about the balance between lab space and office space/student seating. No one likes giant cubicle farm student seating, but it does have capacity. In these days of zoom and remote access to experiments, the way students and postdocs use offices is evolving, which makes planning difficult. Health and safety folks would definitely prefer not to have personnel effectively headquartered directly in lab spaces. Seriously, though, when programming a building, you need to think about how many people per PI lab space will need places to sit. I have yet to see a building initially designed with enough seating to handle all the personnel needs if every PI lab were fully occupied and at a high level of research activity.
- Think about maintenance down the line. Every major building system has some lifespan. If a big air handler fails, is it accessible and serviceable, or would that require taking out walls or cutting equipment into pieces and disrupting the entire building? Do you want to set up a situation where you may have to do this every decade? (Asking for a friend.)
- Entering the realm of fantasy, use your vast power and influence to get your organization to emphasize preventative maintenance at an appropriate level, consistently over the years. Universities (and national labs and industrial labs) love "deferred maintenance" because kicking the can down the road can make a possible cost issue now into someone else's problem later. Saving money in the short term can be very tempting. It's also often easier and more glamorous to raise money for the new J. Smith Laboratory for Physical Sciences than it is to raise money to replace the HVAC system in the old D. Jones Engineering Building. Avoid this temptation, or one day (inevitably when times are tight) your university will notice that it has $300M in deferred maintenance needs.
Saturday, June 07, 2025
A precision measurement science mystery - new physics or incomplete calculations?
Sunday, June 01, 2025
Pushing back on US science cuts: Now is a critical time
Every week has brought more news about actions that, either as a collateral effect or a deliberate goal, will deeply damage science and engineering research in the US. Put aside for a moment the tremendously important issue of student visas (where there seems to be a policy of strategic vagueness, to maximize the implicit threat that there may be selective actions). Put aside the statement from a Justice Department official that the general plan is to "bring these universities to their knees", on the pretext that this is somehow about civil rights.
The detailed version of the presidential budget request for FY26 is now out (pdf here for the NSF portion). If enacted, it would be deeply damaging to science and engineering research in the US and the pipeline of trained students who support the technology sector. Taking NSF first: The topline NSF budget would be cut from $8.34B to $3.28B. Engineering would be cut by 75%, Math and Physical Science by 66.8%. The anticipated agency-wide success rate for grants would nominally drop below 7%, though that is misleading (it is basically the present average success rate cut by 2/3, while some programs are already more competitive than others). In practice, many programs already have future-year obligations, and any remaining funds will have to go there, meaning that many programs would likely have no awards at all in the coming fiscal year. The NSF's CAREER program (that agency's flagship young investigator program) would go away. This plan would also close one of the LIGO observatories (see previous link). (This would be an extra bonus level of stupid, since LIGO's ability to do science relies on having two facilities, to avoid false positives and to identify event locations in the sky. You might as well say that you'll keep an accelerator running but not the detector.) Here is the table that I think hits hardest, dollars aside:
The number of people involved in NSF activities would drop by 240,000. The graduate research fellowship program would be cut by more than half. The NSF research training grant program (another vector for grad fellowships) would be eliminated. The situation at NIH and NASA is at least as bleak. See here for a discussion from Joshua Weitz at Maryland, which includes a striking plot.
This proposed dismantling of US research and especially the pipeline of students who support the technology sector (including medical research, computer science, AI, the semiconductor industry, chemistry and chemical engineering, the energy industry) is astonishing in absolute terms. It also does not square with the claim of some of our elected officials and high tech CEOs to worry about US competitiveness in science and engineering. (These proposed cuts are not about fiscal responsibility; just the amount added in the proposed DOD budget dwarfs these cuts by more than a factor of 3.)
If you are a US citizen and think this is the wrong direction, now is the time to talk to your representatives in Congress. In the past, Congress has ignored presidential budget requests for big cuts. The American Physical Society, for example, has tools to help with this. Contacting legislators by phone is also made easy these days. From the standpoint of public outreach, Cornell has an effort backing large-scale writing of editorials and letters to the editor.
Thursday, May 29, 2025
Quick survey - machine shops and maker spaces
Thursday, May 22, 2025
How badly has NSF funding already been effectively cut?
This NY Times feature lets you see how each piece of NSF's funding has been reduced this year relative to the average over the last decade. Note: this fiscal year, thanks to the continuing resolution, the actual agency budget has not been cut like this. The agency is simply not spending congressionally appropriated funds. Fearing/assuming that its budget will get hammered next fiscal year, it does not want to start awards that it won't be able to fund in out-years. The result is effectively obeying in advance the presidential budget request for FY26. (And it's highly likely that some will point to unspent funds later in the year and use that as a justification for cuts, when in fact it's anticipation of possible cuts that has led to the unspent funds. I'm sure the Germans have a polysyllabic word for this. In English, "Catch-22" is close.)
I encourage you to click the link and go to the article where the graphic is interactive (if it works in your location - not sure about whether the link works internationally). The different colored regions are approximately each of the NSF directorates (in their old organizational structure). Each subsection is a particular program.
Sunday, May 18, 2025
A science anecdote palate cleanser
Apologies for slow posting. Real life has been very intense, and I also was rather concerned when one of my readers mentioned last weekend that these days my blog was like concentrated doom-scrolling. I will have more to say about the present university research crisis later, but first I wanted to give a hopefully diverting example of the kind of problem-solving and following-your-nose that crops up in research.
Recently in my lab we have had a need to measure very small changes in electrical resistance of some devices, at the level of a few milliOhms out of kiloOhms - parts in \(10^6\). One of my students put together a special kind of resistance bridge to do this, and it works very well. Note to interested readers: if you want to do this, make sure that you use components with very low temperature coefficients of their properties (e.g., resistors with a very small \(dR/dT\)), because otherwise your bridge becomes an extremely effective thermometer for your lab. It’s kind of cool to be able to see the lab temperature drift around by milliKelvins, but it's not great for measuring your sample of interest.
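The tempco point can be made quantitative with a quick estimate. All numbers below are assumed for illustration (not our actual lab values), but they show why garden-variety resistors turn a parts-in-\(10^6\) bridge into a lab thermometer.

```python
# Assumed illustrative numbers, not the actual lab values:
R = 2.0e3                  # bridge arm resistance, ohms
signal = 5.0e-3 / R        # a few milliohms out of kilohms ~ 2.5 parts in 10^6

alpha_generic = 100e-6     # 100 ppm/K tempco, typical metal-film resistor
alpha_low = 2e-6           # 2 ppm/K, a low-tempco precision part (assumed spec)
dT = 0.010                 # 10 mK of lab temperature drift

# Fractional bridge imbalance produced by the drift is alpha * dT:
drift_generic = alpha_generic * dT   # 1e-6: same order as the signal itself
drift_low = alpha_low * dT           # 2e-8: now far below the signal
print(signal, drift_generic, drift_low)
```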
There are a few ways to measure resistance. The simplest is the two-terminal approach, where you drive currents through and measure voltages across your device with the same two wires. This is easy, but it means that the voltage you measure includes contributions from the contacts those wires make with the device. A better alternative is the four-terminal method, where you use separate wires to supply/collect the current.
Anyway, in the course of doing some measurements of a particular device's resistance as a function of magnetic field at low temperatures, we saw something weird. Below some rather low temperatures, when we measured in a 2-terminal arrangement, we saw a jump up in resistance by around 20 milliOhms (out of a couple of kOhms) as magnetic field was swept up from zero, and a small amount of resistance hysteresis with magnetic field sweep that vanished above maybe 0.25 T. This vanished completely in a 4-terminal arrangement, and also disappeared above about 3.4 K. What was this? Turns out that I think we accidentally rediscovered the superconducting transition in indium. While our contact pads on our sample mount looked clean to the unaided eye, they had previously had indium on there. The magic temperature is very close to the bulk \(T_{c}\) for indium.
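A toy model of the two measurement schemes shows why the indium jump appears only in the two-terminal data. The 20 milliohm scale comes from the observation above; how it splits between the two contacts is assumed for illustration.

```python
def two_terminal(r_sample, r_contacts):
    # Current and voltage share the same two wires, so the contact
    # resistances add directly to what you measure.
    return r_sample + sum(r_contacts)

def four_terminal(r_sample, r_contacts):
    # An ideal voltmeter draws no current through its separate voltage
    # leads, so the contact resistances drop out entirely.
    return r_sample

r_sample = 2.0e3                  # ohms, the order of magnitude from the post
sc_contacts = [0.0, 0.0]          # indium pads superconducting below ~3.4 K
normal_contacts = [0.010, 0.010]  # field drives the pads normal (split assumed)

jump_2t = two_terminal(r_sample, normal_contacts) - two_terminal(r_sample, sc_contacts)
jump_4t = four_terminal(r_sample, normal_contacts) - four_terminal(r_sample, sc_contacts)
print(jump_2t, jump_4t)  # the ~20 milliohm jump shows up only in two-terminal
```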
For one post, rather than dwelling on the terrible news about the US science ecosystem, does anyone out there have other, similar fun experimental anecdotes? Glitches that turned out to be something surprising? Please share in the comments.
Monday, May 05, 2025
Updates, thoughts about industrial support of university research
- NSF has now frozen all funding for new and continuing awards. This is not good; just how bad it is depends on the definition of "until further notice".
- Here is an open letter from the NSF employees union to the basically-silent-so-far National Science Board, asking for the NSB to support the agency.
- Here is a grass roots SaveNSF website with good information and suggestions for action - please take a look.
- NSF also wants to cap indirect cost rates at 15% for higher ed institutions for new awards. This will almost certainly generate a lawsuit from the AAU and others.
- Speaking of the AAU, last week there was a hearing in the Massachusetts district court regarding the lawsuits about the DOE setting indirect cost rates to 15% for active and new awards. There had already been a temporary restraining order in place nominally stopping the change; the hearing resulted in that order being extended "until a further order is issued resolving the request for a temporary injunction." (See here, the entry for April 29.)
- In the meantime, the presidential budget request has come out, and if enacted it would be devastating to the science agencies. Proposed cuts include 55% to NSF, 40% to NIH, 33% to USGS, 25% to NOAA, etc. If these cuts went through, we are talking about more than $35B, at a rough eyeball estimate.
- And here is a letter from former NSF directors and NSB chairs to the appropriators in Congress, asking them to ignore that budget request and continue to support government sponsored science and engineering research.
Friday, April 25, 2025
NSF, quo vadis?
There is a lot going on. Today, some words about NSF.
Yesterday Sethuraman Panchanathan, the director of the National Science Foundation, resigned 16 months before the end of his six year term. The relevant Science article raises the possibility that this is because, as an executive branch appointee, he would effectively have to endorse the upcoming presidential budget request, which is rumored to be a 55% cut to the agency budget (from around $9B/yr to $4B/yr) and a 50% reduction in agency staffing. (Note: actual appropriations are set by Congress, which has ignored presidential budget requests in the past.) This comes at the end of a week when all new awards were halted at the agency while non-agency personnel conducted "a second review" of all grants, and many active grants have been terminated. Bear in mind, awards this year from NSF are already down 50% over last year, even without official budget cuts. Update: Here is Nature's reporting from earlier today.
The NSF has been absolutely critical to a long list of scientific and technological advances over the last 70 years (see here while it's still up). As mentioned previously, government support of basic research has a great return on investment for the national economy, and it's a tiny fraction of government spending. Less than three years ago, the CHIPS & Science Act was passed with supposed bipartisan support in Congress, authorizing the doubling of the NSF budget. Last summer I posted in frustration that this support seemed to be an illusion when it came to actual funding.
People can have disagreements about the "right" level of government support for science in times of fiscal challenges, but as far as I can tell, no one (including and especially Congress so far) voted for the dismantling of the NSF. If you think the present trajectory is wrong, contact your legislators and make your voices heard.
Sunday, April 20, 2025
A Grand Bargain and its chaotic dissolution
A couple of days ago, the New York Times published a related figure, showing the growth in dollars of total federal funds sent to US universities, but I think this is a more meaningful graph (hat tip to Prof. Elizabeth Popp Berman at Michigan for her discussion of this). In 2021, federal investment in research (the large majority of which is happening at universities) as a percentage of GDP was at its lowest level since 1953, and it was sinking further even before this year (a worry for those concerned about US competitiveness; also, industry does far more D than long-term R). There are many studies by economists showing that federal investment in research has a large return (for example, here is one by the Federal Reserve Bank of Dallas estimating that returns to the US economy on federal research expenditures are between 150% and 300%). Remember, these funds are not just given to universities - they are in the form of grants and contracts, for which specific work is done and reported. These investments also helped make US higher education the envy of much of the world and made the education of international students a tremendously effective export for the country.
Tuesday, April 15, 2025
Talk about "The Direct Democracy of Matter"
The Scientia Institute at Rice sponsors a series of public lectures annually, centered around a theme. The intent is to get a wide variety of perspectives spanning the humanities, social sciences, arts, sciences, and engineering, presented in an accessible way. The YouTube channel with recordings of recent talks is here.
This past year, the theme was "democracy" in its broadest sense. I was honored to be invited last year to contribute a talk, which I gave this past Tuesday, following a presentation by my CS colleague Rodrigo Ferreira about whether AI has politics. Below I've embedded the video, with the start time set where I begin (27:00, so you can rewind to see Rodrigo).
Which (macroscopic) states of matter do we see? The ones that "win the popular vote" of the microscopic configurations.
Sunday, April 13, 2025
US science situation updates and what's on deck
- There have been very large scale personnel cuts across HHS, FDA, CDC, NIH - see here. This includes groups like the people who monitor lead in drinking water.
- There is reporting on the upcoming presidential budget requests for NASA and NOAA. The requested cuts are very deep. To quote Eric Berger's article linked above, for the science part of NASA, "Among the proposals were: A two-thirds cut to astrophysics, down to $487 million; a greater than two-thirds cut to heliophysics, down to $455 million; a greater than 50 percent cut to Earth science, down to $1.033 billion; and a 30 percent cut to Planetary science, down to $1.929 billion." The proposed cuts to NOAA are similarly deep, seeking to end climate study in the agency, as Science puts it. The full presidential budget request, including NSF, DOE, NIST, etc. is still to come. Remember, Congress in the past has often essentially ignored presidential budget requests. It is unclear if the will exists to do so now.
- Speaking of NSF, the graduate research fellowship program award announcements for this year came out this past week. The agency awarded slightly under half as many of these prestigious 3-year fellowships as in each of the last 15 years. I can only presume that this is because the agency is deeply concerned about its budgets for the next couple of fiscal years.
- Grants are being frozen at several top private universities - these include Columbia (new cancellations), the University of Pennsylvania (here), Harvard (here), Northwestern and Cornell (here), and Princeton (here). There are various lawsuits filed about all of these. Princeton and Harvard have been borrowing money (issuing bonds) to partly deal with the disruption as litigation continues. The president of Princeton has been more vocal than many about this.
- There has been a surge in visa revocations and unannounced student status changes in SEVIS for international students in the US. To say that this is unsettling is an enormous understatement. See here for a limited discussion. Universities seem deeply reluctant to speak out about this, presumably from the worry that saying the wrong thing will place their international students and scholars at greater risk.
- On Friday evening, the US Department of Energy put out a "policy flash", stating that indirect cost rates on its grants would be cut immediately to 15%. This sounds familiar. Legal challenges are undoubtedly beginning.
- Added bonus: According to the Washington Post, DOGE (whatever they say they are this week) is now in control of grants.gov, the website that posts funding opportunities. As the article says, "Now the responsibility of posting these grant opportunities is poised to rest with DOGE — and if its employees delay those postings or stop them altogether, 'it could effectively shut down federal-grant making,' said one federal official who spoke on the condition of anonymity to describe internal operations."
Saturday, April 12, 2025
What is multiferroicity?
(A post summarizing recent US science-related events will be coming later. For now, here is my promised post about multiferroics, inspired in part by a recent visit to Rice by Yoshi Tokura.)
Electrons carry spins and therefore magnetic moments (that is, they can act in some ways like little bar magnets), and as I was teaching undergrads this past week, under certain conditions some of the electrons in a material can spontaneously develop long-range magnetic order. That is, rather than being randomly oriented on average, below some critical temperature the spins take on a pattern that repeats throughout the material. In the ordered state, if you know the arrangement of spins in one (magnetic) unit cell of the material, that pattern is repeated across many (perhaps all, if the system is a single domain) of the unit cells. In picking out this pattern, the overall symmetry of the material is lowered compared to the non-ordered state. (There can be local moment magnets, when the electrons with the magnetic moments are localized to particular atoms; there can also be itinerant magnets, when the mobile electrons in a metal take on a net spin polarization.) The most famous kind of magnetic order is ferromagnetism, when the magnetic moments spontaneously align along a particular direction, often leading to magnetic fields projected out of the material. Magnetic materials can be metals, semiconductors, or insulators.
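A minimal way to see how spontaneous order like this appears below a critical temperature is mean-field theory for an Ising ferromagnet (my illustrative sketch, not tied to any particular material): each spin feels an effective field from its \(z\) neighbors, giving the self-consistency condition \(m = \tanh(zJm/k_{\mathrm{B}}T)\), which has a nonzero solution only for \(T < T_{c} = zJ/k_{\mathrm{B}}\).

```python
import math

def mean_field_magnetization(T, J=1.0, z=4, tol=1e-10):
    """Solve m = tanh(z*J*m/T) by fixed-point iteration (units with k_B = 1).
    Returns the non-negative spontaneous magnetization per spin."""
    m = 1.0                              # start from full polarization
    for _ in range(10000):
        m_new = math.tanh(z * J * m / T)
        if abs(m_new - m) < tol:
            break
        m = m_new
    return m

Tc = 4.0  # z*J with z = 4 (square lattice), J = 1
for T in (0.5 * Tc, 0.9 * Tc, 1.1 * Tc):
    # m is large deep below Tc, small near Tc, and zero above Tc
    print(f"T/Tc = {T/Tc:.1f}: m = {mean_field_magnetization(T):.4f}")
```

Below \(T_c\) the iteration settles on a finite magnetization; above \(T_c\) it collapses to zero, the mean-field caricature of the symmetry-lowering transition described above.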
In insulators, an additional kind of order is possible, based on electric polarization, \(\mathbf{P}\). There is subtlety about defining polarization, but for the purposes of this discussion, the question is whether the atoms within each unit cell bond appropriately and are displaced below some critical temperature to create a net electric dipole moment, leading to ferroelectricity. (Antiferroelectricity is also possible.) Again, the ordered state has lower symmetry than the non-ordered state. Ferroelectric materials have some interesting applications.
BiFeO3, a multiferroic antiferromagnet; image from here.
Multiferroics are materials that have simultaneous magnetic order and electric polarization order. A good recent review is here. For applications, obviously it would be convenient if both the magnetic and polarization ordering happened well above room temperature. There can be deep connections between the magnetic order and the electric polarization - see this paper, and this commentary. Because of these connections, the low energy excitations of multiferroics can be really complicated, like electromagnons. Similarly, there can be combined "spin textures" and polarization textures in such materials - see here and here. Multiferroics raise the possibility of using applied voltages (and hence electric fields) to flip \(\mathbf{P}\), and thus toggle around \(\mathbf{M}\). This has been proposed as a key enabling capability for information processing devices, as in this approach. These materials are extremely rich, and it feels like their full potential has not yet been realized.
Sunday, March 30, 2025
Science updates - brief items
Here are a couple of neat papers that I came across in the last week. (Planning to write something about multiferroics as well, once I have a bit of time.)
- The idea of directly extracting useful energy from the rotation of the earth sounds like something out of an H. G. Wells novel. At a rough estimate (and it's impressive to me that AI tools are now able to provide a convincing step-by-step calculation of this; I tried w/ gemini.google.com) the rotational kinetic energy of the earth is about \(2.6 \times 10^{29}\) J. The tricky bit is, how do you get at it? You might imagine constructing some kind of big space-based pick-up coil and getting some inductive voltage generation as the earth rotates its magnetic field past the coil. Intuitively, though, it seems like while sitting on the (rotating) earth, you should in some sense be comoving with respect to the local magnetic field, so it shouldn't be possible to do anything clever that way. It turns out, though, that Lorentz forces still apply when moving a wire through the axially symmetric parts of the earth's field. This has some conceptual contact with Faraday's dc electric generator. With the right choice of geometry and materials, it is possible to use such an approach to extract some (tiny at the moment) power. For the theory proposal, see here. For an experimental demonstration, using thermoelectric effects as a way to measure this (and confirm that the orientation of the cylindrical shell has the expected effect), see here. I need to read this more closely to decide if I really understand the nuances of how it works.
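The rough number above is easy to reproduce by hand (a quick sketch; I'm assuming a uniform-density sphere here, which slightly overestimates things because the real Earth is centrally condensed, with a moment of inertia closer to \(0.33 M R^2\)):

```python
import math

# Assumed round-number parameters for Earth
M = 5.972e24          # mass, kg
R = 6.371e6           # mean radius, m
T = 86164.0           # sidereal day, s

omega = 2 * math.pi / T          # angular velocity, rad/s
I = 0.4 * M * R**2               # moment of inertia, uniform sphere, kg m^2
KE = 0.5 * I * omega**2          # rotational kinetic energy, J

print(f"KE ~ {KE:.2e} J")        # ~2.6e29 J with the uniform-sphere assumption
```

Using the measured moment of inertia instead drops this to roughly \(2.1 \times 10^{29}\) J, but either way it is an enormous reservoir of energy.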
- On a completely different note, this paper came out on Friday. (Full disclosure: The PI is my former postdoc and the second author was one of my students.) It's an impressive technical achievement. We are used to the fact that usually macroscopic objects don't show signatures of quantum interference. Inelastic interactions of the object with its environment effectively suppress quantum interference effects on some time scale (and therefore some distance scale). Small molecules are expected to still show electronic quantum effects at room temperature, since they are tiny and their electronic levels are widely spaced, and here is a review of what this could do in electronic measurements. Quantum interference effects should also be possible in molecular vibrations at room temperature, and they could manifest themselves through the vibrational thermal conduction through single molecules, as considered theoretically here. This experimental paper does a bridge measurement to compare the thermal transport between a single-molecule-containing junction between a tip and a surface, and an empty (farther spaced) twin tip-surface geometry. They argue that they see differences between two kinds of molecules that originate from such quantum interference effects.
Thursday, March 20, 2025
March Meeting 2025, Day 4 and wrap-up
I saw a couple of interesting talks this morning before heading out:
- Alessandro Chiesa of Parma spoke about using spin-containing molecules potentially as qubits, and about chiral-induced spin selectivity (CISS) in electron transfer. Regarding the former, here is a review. Spin-containing molecules can have interesting properties as single qubits, or, for spins higher than 1/2, qudits, with unpaired electrons often confined to a transition metal or rare earth ion somewhat protected from the rest of the universe by the rest of the molecule. The result can be very long coherence times for their spins. Doing multi-qubit operations is very challenging with such building blocks, however. There are some theory proposals and attempts to couple molecular qubits to superconducting resonators, but it's tough! Regarding chiral induced spin selectivity, he discussed recent work trying to use molecules where a donor region is linked to an acceptor region via a chiral bridge, and trying to manipulate spin centers this way. A question in all the CISS work is, how can the effects be large when spin-orbit coupling is generally very weak in light, organic molecules? He has a recent treatment of this, arguing that if one models the bridge as a chain of sites with large \(U/t\), where \(U\) is the on-site repulsion energy and \(t\) is the hopping contribution, then exchange processes between sites can effectively amplify the otherwise weak spin-orbit effects. I need to read and think more about this.
- Richard Schlitz of Konstanz gave a nice talk about some pretty recent research using a scanning tunneling microscope tip (with magnetic iron atoms on the end) to drive electron paramagnetic resonance in a single pentacene molecule (sitting on MgO on Ag, where it tends to grab an electron from the silver and host a spin). The experimental approach was initially explained here. The actual polarized tunneling current can drive the resonance, and exactly how depends on the bias conditions. At high bias, when there is strong resonant tunneling, the current exerts a damping-like torque, while at low bias, when tunneling is far off resonance, the current exerts a field-like torque. Neat stuff.
- Leah Weiss from Chicago gave a clear presentation about not-yet-published results (based on earlier work), doing optically detected EPR of Er-containing molecules. These condense into mm-sized molecular crystals, with the molecular environment being nice and clean, leading to very little inhomogeneous broadening of the lines. There are spin-selective transitions that can be driven using near telecom-wavelength (1.55 \(\mu m\)) light. When the (anisotropic) \(g\)-factors of the different levels are different, there are some very promising ways to do orientation-selective and spin-selective spectroscopy. Looking forward to seeing the paper on this.
- I'm not sold on the combined March/April meeting. Six years ago when I was a DCMP member-at-large, the discussion was all about how the March Meeting was too big, making it hard to find and get good deals on host sites, and maybe the meeting should split. Now they've made it even bigger. Doesn't this make planning more difficult and hosting more expensive since there are fewer options? (I'm not an economist, but....) A benefit for the April meeting attendees is that grad students and postdocs get access to the career/networking events held at the MM. If you're going to do the combination, then it seems like you should have the courage of your convictions and really mingle the two, rather than keeping the March talks in the convention center and the April talks in site hotels.
- I understand that van der Waals/twisted materials are great laboratories for physics, and that topological states in these are exciting. Still, by my count there were 7 invited sessions broadly about this topic, and 35 invited talks on this over four days seems a bit extreme.
- By my count, there were eight dilution refrigerator vendors at the exhibition (Maybell, Bluefors, Ice, Oxford, Danaher/Leiden, Formfactor, Zero-Point Cryo, and Quantum Design if you count their PPMS insert). Wow.
Wednesday, March 19, 2025
March Meeting 2025, Day 3
Another busy day at the APS Global Physics Summit. Here are a few highlights:
- Shahal Ilani of the Weizmann gave an absolutely fantastic talk about his group's latest results from their quantum twisting microscope. In a scanning tunneling microscope, because tunneling happens at an atomic-scale location between the tip and the sample, the momentum in the transverse direction is not conserved - that is, the tunneling averages over a huge range of \(\mathbf{k}\) vectors for the tunneling electron. In the quantum twisting microscope, electrons tunnel from a flat (graphite) patch something like \(d \sim\) 100 nm across, coherently, through a couple of layers of some insulator (like WSe2) and into a van der Waals sample. In this case, \(\mathbf{k}\) in the plane is comparatively conserved, and by rotating the sample relative to the tip, it is possible to build up a picture of the sample's electronic energy vs. \(\mathbf{k}\) dispersion, rather like in angle-resolved photoemission. This has allowed, e.g., mapping of phonons via inelastic tunneling. His group has applied this to magic angle twisted bilayer graphene, a system that has a peculiar combination of properties, where in some ways the electrons act like very local objects, and in other ways they act like delocalized objects. The answer seems to be that this system at the magic angle is a bit of an analog of a heavy fermion system, where there are sort of local moments (living in very flat bands) interacting and hybridizing with "conduction" electrons (bands crossing the Fermi level at the Brillouin zone center). The experimental data (movies of the bands as a function of energy and \(\mathbf{k}\) in the plane as the filling is tuned via gate) are gorgeous and look very much like theoretical models.
- I saw a talk by Roger Melko about applying large language models to try to get efficient knowledge of many-body quantum states, or at least the possible outputs of evolution of a quantum system like a quantum computer based on Rydberg atoms. It started fairly pedagogically, but I confess that I got lost in the AI/ML jargon about halfway through.
- Francis M. Ross, recipient of this year's Keithley Award, gave a great talk about using transmission electron microscopy to watch the growth of materials in real time. She had some fantastic videos - here is a review article about some of the techniques used. She also showed some very new work using a focused electron beam to make arrays of point defects in 2D materials that looks very promising.
- Steve Kivelson, recipient of this year's Buckley Prize, presented a very nice talk about his personal views on the theory of high temperature superconductivity in the cuprates. One basic point: these materials are balancing between multiple different kinds of emergent order (spin density waves, charge density waves, electronic nematics, perhaps pair density waves). This magnifies the effects of quenched disorder, which can locally tip the balance one way or another. Recent investigations of the famous 2D square lattice Hubbard model show this as well. He argues that for a broad range \(1/2 < U/t < 8\), where \(U\) is the on-site repulsion and \(t\) is the hopping term, the ground state of the Hubbard model is in fact a charge density wave, not a superconductor. However, if there is some amount of disorder in the form of \(\delta t/t \sim 0.1-0.2\), the result is a robust, unavoidable superconducting state. He further argues that increasing the superconducting transition temperature requires striking a balance between the underdoped case (strong pairing, weak superfluid phase stiffness) and the overdoped case (weak pairing, strong superfluid stiffness), and that one way to achieve this would be in a bilayer with broken mirror symmetry (say different charge reservoir layers above and below, and/or a big displacement field perpendicular to the plane). (Apologies for how technical that sounded - hard to reduce that one to something super accessible without writing much more.)
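For readers who haven't met the Hubbard model: even its two-site version at half filling can be diagonalized exactly, and the way \(U\) penalizes double occupancy while \(t\) favors delocalization is already visible there. Here's a toy sketch (my addition, purely pedagogical - it says nothing about Kivelson's 2D lattice results); the singlet ground-state energy is \((U - \sqrt{U^2 + 16t^2})/2\):

```python
import numpy as np

def two_site_hubbard_ground_energy(U, t):
    """Exact ground-state energy of the two-site Hubbard model at half
    filling, Sz = 0 sector.  Basis: |up-down, 0>, |0, up-down>,
    |up, down>, |down, up>; doubly occupied sites cost U, hopping is -t."""
    H = np.array([[U,   0., -t, -t],
                  [0.,  U,  -t, -t],
                  [-t, -t,  0., 0.],
                  [-t, -t,  0., 0.]])
    return np.linalg.eigvalsh(H)[0]

t = 1.0
for U in (0.5, 4.0, 8.0):
    E_num = two_site_hubbard_ground_energy(U, t)
    E_exact = (U - np.sqrt(U**2 + 16 * t**2)) / 2   # analytic singlet energy
    print(f"U/t = {U:4.1f}:  E0 = {E_num:+.4f}  (analytic {E_exact:+.4f})")
```

As \(U/t\) grows, the ground-state energy approaches \(-4t^2/U\), the superexchange scale; the competition between such local-moment physics and delocalization is the heart of the hard 2D problem.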
Tuesday, March 18, 2025
March Meeting 2025, Day 2
I spent a portion of today catching up with old friends and colleagues, so fewer highlights, but here are a couple:
- Like a few hundred other people, I went to the invited talk by Chetan Nayak, leader of Microsoft's quantum computing effort. It was sufficiently crowded that the session chair warned everyone about fire code regulations and that people should not sit on the floor blocking the aisles. To set the landscape: Microsoft's approach to quantum computing is to develop topological qubits based on interesting physics that is predicted to happen (see here and here) if one induces superconductivity (via the proximity effect) in a semiconductor nanowire with spin-orbit coupling. When the right combination of gate voltage and external magnetic field is applied, the nanowire should cross into a topologically nontrivial state with Majorana fermions localized to each end of the nanowire, leading to "zero energy states" seen as peaks in the conductance \(dI/dV\) centered at zero bias (\(V=0\)). A major challenge is that disorder in these devices can lead to other sources of zero-bias peaks (Andreev bound states). A 2023 paper outlines a protocol that is supposed to give good statistical feedback on whether a given device is in the topologically interesting or trivial regime. I don't want to rehash the history of all of this. In a paper published last month, a single proximitized, gate-defined InAs quantum wire is connected to a long quantum dot to form an interferometer, and the capacitance of that dot is sensed via RF techniques as a function of the magnetic flux threading the interferometer, showing oscillations with period \(h/2e\), interpreted as charge parity oscillations of the proximitized nanowire. In new data, not yet reported in a paper, Nayak presented measurements on a system comprising two such wires and associated other structures. The argument is that each wire can be individually tuned simultaneously into the topologically nontrivial regime via the protocol above.
Then interferometer measurements can be performed in one wire (the Z channel) and in a configuration involving two ends of different wires (the X channel), and they interpret their data as early evidence that they have achieved the desired Majorana modes and their parity measurements. I look forward to a paper coming out on this, as it is hard to make informed statements based just on what I saw quickly on slides from a distance.
- In a completely different session, Garnet Chan gave a very nice talk about applying advanced quantum chemistry and embedding techniques to look at some serious correlated materials physics. Embedding methods are somewhat reminiscent of mean field theories: Instead of trying to solve the Schrödinger equation for a whole solid, for example, you can treat the solid as a self-consistent theory of a unit cell or set of unit cells embedded in a more coarse-grained bath (made up of other unit cells appropriately averaged). See here, for example. He presented recent results on computing the Kondo effect of magnetic impurities in metals, understanding the trends of antiferromagnetic properties of the parent cuprates, and trying to describe superconductivity in the doped cuprates. Neat stuff.
- In the same session, my collaborator Silke Buehler-Paschen gave a nice discussion of ways to use heavy fermion materials to examine strange metals, looking beyond just resistivity measurements. Particularly interesting is the idea of trying to figure out quantum Fisher information, which in principle can tell you how entangled your many-body system is (that is, estimating how many other degrees of freedom are entangled with one particular degree of freedom). See here for an intro to the idea, and here for an implementation in a strange metal, Ce3Pd20Si6.
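To give a flavor of what quantum Fisher information measures (my sketch, not from the talk): for a pure state and a collective observable \(A\), \(F_Q = 4(\langle A^2\rangle - \langle A\rangle^2)\). A product state of \(N\) spins gives at most \(F_Q = N\), while a maximally entangled GHZ state probed with \(A = S_z\) gives \(F_Q = N^2\), so \(F_Q/N > 1\) witnesses multipartite entanglement:

```python
import numpy as np

def ghz_qfi(N):
    """Quantum Fisher information of an N-qubit GHZ state for the
    collective spin S_z = (1/2) * sum_i sigma_z^i.
    For a pure state, F_Q = 4 * Var(S_z)."""
    dim = 2**N
    psi = np.zeros(dim)
    psi[0] = psi[-1] = 1 / np.sqrt(2)      # (|00...0> + |11...1>)/sqrt(2)
    # S_z is diagonal in the computational basis:
    # eigenvalue (n_zeros - n_ones)/2 for each bit string
    sz_diag = np.array([(N - 2 * bin(s).count("1")) / 2 for s in range(dim)])
    mean = psi @ (sz_diag * psi)
    mean_sq = psi @ (sz_diag**2 * psi)
    return 4 * (mean_sq - mean**2)

for N in (2, 4, 6):
    print(N, ghz_qfi(N))    # grows as N^2 (Heisenberg scaling)
```

The experimental subtlety, of course, is that in a strange metal one only has access to thermal states and measured response functions (dynamical susceptibilities) rather than a pure-state wavefunction, which is what makes the implementation in Ce3Pd20Si6 interesting.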