
Saturday, December 31, 2022

Favorite science fiction invention?

In the forward-looking spirit of the New Year, it might be fun to get readers’ opinions of their favorite science fiction inventions.  I wrote about favorite sci-fi materials back in 2015, but let’s broaden the field. Personally, I’m a fan of the farcaster (spoiler warning!) from the Hyperion Cantos of Dan Simmons.  I also have a long-time affection for Larry Niven’s Known Space universe, especially the General Products Hull (a single molecule, transparent to visible light but opaque at all other wavelengths, with its binding somehow strengthened by an external fusion power source) and the Slaver Disintegrator, which somehow turns off the negative charge of the electron and thus makes matter tear itself apart from the unscreened Coulomb repulsion of the protons in atomic nuclei.  Please comment below with your favorites.

On another new year’s note, someone needs to do a detailed study of the solubility limit of crème de cassis in champagne.  Too high a cassis-to-champagne ratio in your kir royales, and you end up with extra cassis stratified at the bottom of your flute, as shown here.


Happy new year to all, and best wishes for a great 2023.



Tuesday, December 27, 2022

The difficult need for creativity on demand

Thoughts at the end of another busy year…. Good science is a creative enterprise.  Some stereotypes paint most scientists as toiling away, so deeply constrained by logic that they function more like automatons grinding out the next incremental advance in a steady if slow march of progress. In practice, originality and creativity are necessary to develop and grow a research program.  Some of this is laid out (paradoxically, in a methodical list) by Carl Wieman in this article.   Picking the right open questions to address (hopefully ones that are deep and interesting to other people as well as you), and figuring out how to address them given the tools at your disposal, frequently requires intuition, leaps beyond incrementalism, and some measure of intellectual risk-taking.

One aspect of modern science with which I find myself struggling is the issue of time. We live in a short-term world.  Grants are generally brief in duration compared to doctoral student timescales and the time it takes to tackle big questions.  There are many more demands on our time than in the past, and it seems like most funding sources profess to want fresh, groundbreaking ideas that are both high-risk/transformative/disruptive and yet somehow very likely to produce rapid, high-profile glossy results.  Some also want to see brand new approaches to education and outreach each time.  Finding the time to think deeply about the science and the educational aspects, while reinventing research programs like clockwork, is something that I find very challenging.  One answer, since creativity doesn’t generally respond to on-demand calls, is to always be thinking and noodling on ideas, but that’s much easier said than done consistently.   I’d be curious to hear others’ strategies for dealing with this; while I’m pretty set in how I work at this point, a discussion could be fun and useful.


Saturday, December 17, 2022

Brief items - LOC, GPT, etc.

 This year was a busy one and my overall posting rate is down.  Hopefully the coming year will be a bit less frenetic, but who knows.  A few brief items:

  • First, in the odd self-promotion department, this blog is officially going to be indexed by the Library of Congress as part of their Science Blogs Web Archive.  This is another sign that I am officially ancient in blogging terms.  This blog has never had the huge readership of some, but thanks to you for raising it above whatever the threshold of notice is for this.
  • This was a cute story.  Folks at ETH have shown that a thin, barely percolating layer of gold can act as an anti-fogging coating on glasses, since it can locally heat up due to its infrared absorption, while still being sufficiently transparent for use in eyewear.  At the risk of costing myself a lucrative potential patent, it seems to me that you could do the same thing using TiN, which has similar near-IR optical properties, and should be easy to integrate with the TiO2 coating that the researchers already use.  
  • In a headline that is not a repeat from September, there is a new contender for world’s largest dilution refrigerator under construction, this time at Fermilab.  Multiple quantum computing platforms benefit from sub-100 mK temperatures, so it’s not surprising to see efforts along these lines, but 5 cubic meters of sample chamber seems a bit much.  Time to invest in my Canadian 3He futures.
  • I’m glad to see that someone has been thinking like my sci-fi-loving brain, and working out whether gravitational wave detectors could be used to detect evidence of some types of interstellar spacecraft.  While the paper concerns conjectural accelerating planetary-mass ships, certainly exotic propulsion ideas (warp drives, wormholes) would also have gravitational radiation signatures.
  • Speaking of science fiction, like a lot of people I spent some time this week playing with chatGPT, the language model that may be the end of high school English essays, college admissions essays, and quite probably a lot of jobs.  Its output is uncanny and worrying, especially since it has no problem just brazenly lying and making up sources.  (For fun, see what happens if you ask it to explain why 51 is a prime number.)  Still, it’s hard not to feel like we are right at some threshold, where expository and creative writing, historically the province of at least somewhat educated humans, is no longer ours alone.  This could mean great things for education (I asked it to explain Stokes’ theorem to me, and it did a pretty nice job), but it could mean terrible things for education (why learn to write well when a free tool can do a decent job for you?).  The calculator and computer did not eradicate math education or math literacy, so hopefully we will reap more of the positives than the negatives.  This post was written 100% by me, btw, with no GPT assistance, though remember that chatGPT lies….
  • At the risk of being deemed a dangerous website by Twitter, where I am here, I’m on a Mastodon instance now as well.  Sciencemastodon.com was started by Charles Seife and hosts a bunch of scientists and science journalists.

Tuesday, December 13, 2022

The fusion story of the day

There is a press conference going on right now announcing a breakthrough at the National Ignition Facility at Livermore.   The NIF is an inertial confinement fusion facility that uses 192 laser beams to compress a fuel pellet containing deuterium and tritium.  The pellet is inside a gold hohlraum, and it's really the x-rays from the gold that do a lot of the heavy lifting in this experiment.  The claim is that the energy output from the D-T fusion (which comes in the form of energetic helium nuclei, 14 MeV neutrons, and x-rays) has now exceeded the energy input from the lasers.  That's clearly necessary if there is ever to be any hope of using this approach to generate actual electricity, but it is far from sufficient. 
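To put rough numbers on "necessary but far from sufficient," here is some back-of-the-envelope arithmetic using figures widely reported around the announcement (treat them all as approximate, and note that the wall-plug energy per shot is only a commonly quoted ballpark):

```python
# Rough gain arithmetic for the reported NIF shot.  All numbers are
# approximate values as widely reported at the time of the announcement.
E_laser_on_target = 2.05   # MJ of laser energy delivered to the target
E_fusion_yield = 3.15      # MJ of fusion energy released
E_wall_plug = 300.0        # MJ drawn from the grid to fire the lasers (ballpark)

target_gain = E_fusion_yield / E_laser_on_target
wall_plug_gain = E_fusion_yield / E_wall_plug

print(f"Target gain (fusion out / laser in):          {target_gain:.2f}")
print(f"Wall-plug gain (fusion out / electricity in): {wall_plug_gain:.3f}")
# roughly 1.5 versus 0.01: a real milestone by the target-gain metric, but
# still about a factor of 100 (before conversion losses) from net electricity
```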

There is some very interesting materials science at work throughout the project that bears on this.  Right now, the lasers used in the NIF are based on doped-glass amplifiers, and those get very hot under use, so that hours are needed between shots.  Also, they basically rebuild the sample mounting for the hohlraum after each shot.  This is fine for proof-of-concept physics experiments, but it's very far from a workable power plant.  

This is an exciting time for fusion research, in that there is a fair bit of activity, including startups.  (Note also that some of these approaches are aiming for scales closer to US Navy sized, like 20 MWe, rather than city power, which is more like 2 GWe.)   To give a sense of my age and the timescale for these projects, when I was an undergrad, I spent a summer doing heat transfer calculations for the cable-in-conduit conductors for the D magnets for ITER.  That was in 1992.  The cliché is that fusion is always 20 years away, but we should know considerably sooner than that whether the startup approaches are likely to get there.  

Saturday, December 03, 2022

The wormhole kerfuffle, ER=EPR, and all that

I was busy trying to finish off a grant proposal and paper revisions this week and didn't have the time to react in real time to the PR onslaught surrounding the recent Nature paper by a team from Harvard, MIT, Fermilab, and Google.  There are many places to get caught up on this, but the short version:

  • Using their Sycamore processor, the experimentalists implemented a small-scale version of the SYK model.  This model has many interesting properties, including the fact that it is a testbed for holography, in which a bulk gravitational system may be understood via the degrees of freedom on its boundary.  For an infinitely large SYK system, there is a duality to a 2D gravitational system.  So, a protocol for moving entanglement among the qubits that make up the SYK system is equivalent to having a traversable wormhole in that 2D gravitational description.  
  • The actual experiment is very cool.
  • The coverage in the press was extensive (Quanta, NY Times, e.g.).  There was a lot of controversy (see Peter Woit's blog for a summary, and Scott Aaronson for a good take) surrounding this, because some of the initial language implied to lay readers that the team had actually created a traversable wormhole.  Quanta revised their headline and qualified their language, to their credit.  
Rather than dogpile on the media coverage, I want to focus on two main points that I think are worthy of discussion:
  1.  What do we mean when we say that we have experimentally implemented a model of a system?     When atomic physicists use ultracold fermionic atoms to make a 2D lattice governed by the Mott-Hubbard model (like here and here), we say that they have made a Mott insulator.  That same model is thought to be a good description of copper oxide superconductors.  However, no one would say that it actually is a copper oxide superconductor.  When is a model of a thing actually the thing itself?   This is at the heart of the whole topic of quantum simulation, but the issue comes up in classical systems as well.  My two cents:  If system A and system B are modeled extremely well by the same mathematics, that can give us real insights, but it doesn't mean that system A is system B.  Better language might be to say that system A is an analog to system B.  Physicists can be sloppy with language, and certainly it is much more attention-getting to editors of all stripes (be they journal editors or journalism editors) to have a short, punchy, bold description.  Still, it's better to be careful.  
  2. What do theorists like Lenny Susskind truly mean when they claim that entanglement is genuinely equivalent to wormholes?  This is summarized by the schematic equation ER = EPR, where ER = Einstein-Rosen wormhole and EPR = Einstein-Podolsky-Rosen entanglement.  I think I get the core intellectual idea that, in quantum gravity, spacetime itself may be emergent from underlying degrees of freedom that may be modeled as sorts of qubits; and that one can come up with fascinating thought experiments about what happens when dropping one member of an entangled pair of particles into the event horizon of a black hole.  That being said, as an experimentalist, the idea that any kind of quantum entanglement involves actual Planck-scale wormholes just seems bonkers.  That would imply that sending a photon through a nonlinear crystal and producing two lower energy entangled photons is actually creating a Planck-scale change in the topology of spacetime.  Perhaps someone in the comments can explain this to me.  Again, maybe this is me not understanding people who are being imprecise with their word choice.

Tuesday, November 22, 2022

The need for energy-efficient computing

Computing is consuming a large and ever-growing fraction of the world's energy capacity.
I've seen the essential data in this figure several times over the last few months, and it has convinced me that the need for energy-efficient computing hardware is genuinely pressing.  This is from a report by the Semiconductor Research Corporation from 2020.  It argues that if computing needs continue to grow at the present rate, then by the early 2030s something like 10% of all of the world's energy production (and therefore something like 40% of the world's electricity production) will be tied up in computing hardware.  (ZIPs = \(10^{21}\) instructions per second)
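Just to show the arithmetic behind that kind of projection, here is a purely illustrative compound-growth sketch.  Every number in it is an assumption chosen for the example, not a figure taken from the SRC report:

```python
# Purely illustrative compound-growth arithmetic: if computing's share of
# world energy grows at a fixed net rate, when does it hit 10%?
from math import log

f0 = 0.02          # assumed current share of world energy used by computing
target = 0.10      # the ~10% level mentioned above
growth = 0.20      # assumed net annual growth of computing energy demand
                   # (demand growth minus efficiency and supply growth)

years = log(target / f0) / log(1 + growth)
print(f"Share reaches {target:.0%} after about {years:.0f} years")
# with these made-up numbers, ~9 years, i.e., roughly the early 2030s
```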

Now, we all know the dangers of extrapolation.  Still, this trend tells us that something is going to change drastically - either the rate at which computing power grows will slow dramatically, or we will be compelled to find a much more energy-efficient computational approach, or some intermediate situation will develop.  (Note:  getting rid of cryptocurrencies sure wouldn't hurt, as they are incredibly energy-hungry and IMO have virtually zero positive contributions to the world, but that just slows the timeline.)

I've written before about neuromorphic computing as one approach to this problem.  Looking at neural nets as an architectural model is not crazy - your brain consumes about 12 W of power continuously, but it is far better at certain tasks (e.g. identifying cat pictures) than much more power-hungry setups.  Here is a nice article from Quanta on this, referencing a recent Nature paper.  Any big change will likely require the adoption of new materials and therefore new manufacturing processes.  Just something to bear in mind when people ask why anyone is studying the physics of electronic materials.

Saturday, November 12, 2022

Bob Curl - it is possible to be successful and also a good person

I went to a memorial service today at Rice for my late colleague Bob Curl, who died this past summer, and it was a really nice event.  I met Bob almost immediately upon my arrival at Rice back in 2000 (though I’d heard about him from my thesis advisor, who’d met him at the Nobel festivities in Stockholm in 1996).  As everyone who interacted with him for any length of time will tell you, he was simultaneously extremely smart and amazingly nice.  He was very welcoming to me, even though I was a new assistant professor not even in his department.  I’d see him at informal weekly lunch gatherings of some folks from what was then called the Rice Quantum Institute, and he was always interested in learning about what his colleagues were working on - he had a deep curiosity and an uncanny ability to ask insightful questions.  He was generous with his time and always concerned about students and the well-being of the university community.

A refrain that came up over and over at the service was that Bob listened.  He talked with you, not at you, whether you were an undergrad, a grad student, a postdoc, a professor, or a staff member.  I didn’t know him nearly as well as others, but in 22 years I never heard him say a cross word or treat anyone with less than respect.  

His insatiable curiosity also came up repeatedly.  He kept learning new topics, right up to the end, and actually coauthored papers on economics, like this one.  By all accounts he was scientifically careful and rigorous.

Bob was a great example of how it is possible to be successful as an academic and a scientist while still being a nice person.  It’s important to be reminded of that sometimes.

Saturday, November 05, 2022

The 2022 Welch Conference

The last couple of weeks have been very full.  

One event was the annual Welch Foundation conference (program here).  The program chair for this one was W. E. Moerner, an expert on (and Nobel laureate for) single-molecule spectroscopy, and it was really a great meeting.  I'm not just saying that because it's the first one in several years that was well aligned with my own research.  

The talks were all very good, and I was particularly impressed by the presentation by Yoav Shechtman, who spoke about the use of machine learning in super-resolution microscopy.  It had me convinced that machine learning (ML) can, under the right circumstances, basically be magic.   The key topic is discussed in this paper.  Some flavors of super-resolution microscopy rely on the idea that the fluorescence is coming from individual, hopefully well-separated single emitters.  Diffraction limits the size of a spot, but if you know that the light is coming from one emitter, you can use statistics to figure out the x-y centroid position of that spot to much higher precision.  That can be improved by ML methods, but there's more.  There are ways to get z information as well.  Xiaowei Zhuang's group had this paper in 2008 that's been cited 2000+ times, using a clever idea:  with a cylindrical lens in the beam path, a spot from an emitter above the focal plane is distorted along one axis, while a spot from an emitter below the focal plane is distorted along the orthogonal axis.  In the new work, Shechtman's folks have gone further, putting a phase mask into the path that produces more interesting distortions along those lines.  They use ML trained on a detailed simulation of their microscope data to get improved z precision.  Moreover, they can then use ML to design an optimal version of that phase mask, to get even better precision.  Very impressive.
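To make the statistics point concrete, here is a minimal sketch (certainly not the speakers' code; the PSF width, pixel size, and photon number are all assumed values) showing that the centroid of a diffraction-limited spot can be located far more precisely than the spot size, roughly as \(\sigma/\sqrt{N}\):

```python
# Simulate one bright emitter imaged as a diffraction-limited spot and
# estimate its position from the photon-weighted centroid of the image.
import numpy as np

rng = np.random.default_rng(0)
sigma = 250.0        # assumed PSF width in nm (diffraction-limited spot)
pixel = 100.0        # assumed camera pixel size in nm
n_photons = 2000     # assumed number of detected photons
true_xy = np.array([12.3, -7.8])   # "true" emitter position in nm

estimates = []
for _ in range(200):
    # each detected photon lands at a position drawn from the PSF
    photons = true_xy + sigma * rng.standard_normal((n_photons, 2))
    # bin onto camera pixels, then take the intensity-weighted centroid
    edges = np.arange(-1000.0, 1000.0 + pixel, pixel)
    H, xe, ye = np.histogram2d(photons[:, 0], photons[:, 1], bins=[edges, edges])
    xc = 0.5 * (xe[:-1] + xe[1:])
    yc = 0.5 * (ye[:-1] + ye[1:])
    est_x = np.sum(H.sum(axis=1) * xc) / H.sum()
    est_y = np.sum(H.sum(axis=0) * yc) / H.sum()
    estimates.append([est_x, est_y])

err = np.std(np.array(estimates) - true_xy, axis=0)
print(f"PSF width {sigma:.0f} nm; localization precision ~{err[0]:.0f} nm (x), ~{err[1]:.0f} nm (y)")
# expect roughly sigma/sqrt(N) ~ 6 nm, far smaller than the 250 nm spot
```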

The other talk that really stuck out was the Welch award talk by Carolyn Bertozzi, one of this year's Nobel Laureates in Chemistry.  She gave a great presentation about the history of bioorthogonal chemistry, and it was genuinely inspiring, especially given the clinical treatment possibilities it's opened up.  Even though she must've given some version of that talk hundreds of times, her passion and excitement about the actual chemistry (e.g., "see, these bonds here are really strained, so we know that the reaction has to happen here") was just palpable.  

Wednesday, October 26, 2022

Rice University Academy of Fellows postdoc opportunity, 2023

As I have posted in previous years, Rice has a university-wide endowed honorific postdoctoral program called the Rice Academy of Fellows.   Like all such things, it's very competitive. The new application listing has gone live here, with a deadline of January 4, 2023. Applicants have to have a faculty mentor, so if you are interested in working with me on this, please contact me via email. We've got some fun, exciting stuff going on!

Sunday, October 16, 2022

Materials labs of the future + cost

The NSF Division of Materials Research has been soliciting input from the community about both the biggest outstanding problems in condensed matter and materials science, and the future of materials labs - what kind of infrastructure, training, etc. will be needed to address those big problems.  In thinking about this, I want to throw out a stretch idea.  

I think it would have a transformative impact on materials research and workforce development if there were fabrication and characterization tools that offered great performance at far lower prices than currently possible.  I'd mentioned the idea of developing a super-cheap SEM a while ago. I definitely worry that we are approaching a funding situation where the separation between top universities and everyone else will continue to widen rapidly.  The model of a network of user facilities seems to be how things have been trending (e.g. go to Harvard and use their high-res TEM, if your institution can't afford one).  However, if we really want to move the needle on access and training for a large, highly diverse workforce, it would be incredible to find a way to bring more capabilities to the broadest sweep of universities.   Maybe it's worth thinking hard about what could be possible to radically reduce hardware costs for the suite of materials characterization techniques that would be most important.


Saturday, October 08, 2022

Getting light out of plasmonic tunnel junctions - the sequel

A couple of years ago I wrote about our work on "above threshold" light emission in planar metal tunnel junctions.  In that work, we showed that in a planar tunnel junction, you can apply a bias voltage \(V\) and get lots of photons out at energies \(\hbar \omega\) quite a bit greater than \(eV\).  In the high current regime, when there are strong local plasmon resonances, it is possible to drive (steady state) some part of the electronic distribution to very high effective electron temperatures, and then observe radiation from the recombination of those hot carriers.  One neat thing about this is that by analyzing the spectra, it is possible to back out the actual plasmon-modified density of photonic states for emission to the far field, \(\rho(\hbar \omega)\), of a particular junction.   

In our new paper published this week, we have been able to take this quite a bit further.  In the low current regime with weaker local plasmon resonances, the energy deposited by tunneling electrons is able to diffuse away rapidly compared to the arrival of more carriers, so the kind of carrier heating described above isn't important.  Instead, it's been known for a while that the right way to think about light emission in that case is as a process connected to fluctuations (shot noise) in the tunneling current, as demonstrated very prettily here.  Within that mechanism, it should be possible to predict with precision what the actual emission spectrum should look like, given the tunneling conductance, the bias voltage, and \(\rho(\hbar \omega)\).  As shown in the figure, we can now test this, and it works very well.  Take a planar aluminum tunnel junction made by electromigration, and in the high conductance/high current limit, use the hot carrier emission to determine \(\rho(\hbar \omega)\).  Then gently migrate the junction further to lower the conductance and fall out of the hot carrier emission regime.  Using the measured conductance and the previously found \(\rho(\hbar \omega)\), the theory (dashed lines in the right panel) agrees extremely well with the measured spectra (colored data points) with only two adjustable parameters (an overall prefactor, and a slightly elevated electronic temperature that captures the rounding of the emission at the \(eV\) cutoff, indicated by the arrows in the right panel).  

I think this agreement is pretty darn impressive.  It confirms that we have a quantitative understanding of how shot noise (due to the discreteness of charge!) affects light emission processes all the way up to optical frequencies.
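For readers who want a feel for where the predicted spectral shape comes from, here is a minimal sketch.  It is not the actual analysis from the paper: the \(\rho(\hbar \omega)\) below is a made-up Lorentzian, and the emission-side shot noise is written in a commonly used schematic form that is linear in \(eV - \hbar\omega\) at low temperature and thermally rounded at the cutoff.

```python
# Schematic prediction of the emission spectrum in the shot-noise regime:
# photon flux ~ rho(hw) times the emission-side finite-frequency shot noise,
# which falls off linearly and is thermally rounded at the hw = eV cutoff.
# The Lorentzian rho below is invented purely for illustration.
import numpy as np

kB = 8.617e-5      # Boltzmann constant in eV/K
V = 1.5            # assumed bias in volts, so the cutoff sits at 1.5 eV
Te = 600.0         # assumed (slightly elevated) electronic temperature in K

hw = np.linspace(0.8, 1.9, 400)   # photon energy in eV

def rho(hw):
    # stand-in plasmon-modified photonic density of states (arbitrary units)
    return 1.0 / ((hw - 1.3) ** 2 + 0.15 ** 2)

def shot_noise_factor(hw, V, Te):
    # schematic emission-side factor: ~ (eV - hw) for hw < eV as Te -> 0,
    # smoothly rounded to zero above the cutoff at finite Te
    x = (V - hw) / (kB * Te)
    x = np.where(np.abs(x) < 1e-12, 1e-12, x)   # avoid 0/0 exactly at the cutoff
    return kB * Te * x / (1.0 - np.exp(-x))

spectrum = rho(hw) * shot_noise_factor(hw, V, Te)
for E, S in zip(hw[::80], spectrum[::80]):
    print(f"hw = {E:.2f} eV : predicted intensity ~ {S:.3g} (arb. units)")
```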



Friday, September 30, 2022

Rice University physics faculty searches in quantum

The Department of Physics and Astronomy at Rice University invites applications for a tenure-track faculty position in the area of experimental quantum science using atomic, molecular, or optical methods.  This encompasses quantum information processing, quantum sensing, quantum communication, quantum opto-mechanics, quantum many-body physics, and quantum simulation conducted on a variety of platforms. We seek outstanding scientists whose research will complement and extend existing quantum activities within the Department and across the University (Rice Quantum Initiative: https://quantum.rice.edu/). In addition to developing an independent and vigorous research program, the successful candidates will be expected to teach, on average, one undergraduate or graduate course each semester, and contribute to the service missions of the Department and University. The Department anticipates making an appointment at the assistant professor level. A Ph.D. in physics or related field is required by December 31, 2022.

Applications for this position must be submitted electronically at apply.interfolio.com/114465. Applicants will be required to submit the following: (1) cover letter; (2) curriculum vitae; (3) statement of research; (4) statement on teaching; (5) statement on diversity, mentoring, and outreach; (6) PDF copies of up to three publications; and (7) the names, affiliations, and email addresses of three professional references. Rice University, and the Department of Physics and Astronomy, are strongly committed to a culturally diverse intellectual community. In this spirit, we particularly welcome applications from all genders and members of historically underrepresented groups who exemplify diverse cultural experiences and who are especially qualified to mentor and advise all members of our diverse student population. We will begin reviewing applications by November 15, 2022. To receive full consideration, all application materials must be received by January 1, 2023. The expected appointment date is July, 2023.

____________

The Department of Physics and Astronomy at Rice University invites applications for a tenure-track faculty position in the area of theoretical quantum science using atomic, molecular, or optical methods.  This encompasses quantum information processing, quantum sensing, quantum communication, quantum opto-mechanics, quantum many-body physics, and quantum simulation conducted on a variety of platforms. The ideal theorist will intellectually connect AMO physics to topics in condensed matter and quantum information theory. We seek outstanding scientists whose research will complement and extend existing quantum activities within the Department and across the University (Rice Quantum Initiative: https://quantum.rice.edu/). In addition to developing an independent and vigorous research program, the successful candidates will be expected to teach, on average, one undergraduate or graduate course each semester, and contribute to the service missions of the Department and University. The Department anticipates making an appointment at the assistant professor level. A Ph.D. in physics or related field is required by December 31, 2022.

Applications for this position must be submitted electronically at apply.interfolio.com/114467. Applicants will be required to submit the following: (1) cover letter; (2) curriculum vitae; (3) statement of research; (4) statement on teaching; (5) statement on diversity, mentoring, and outreach; (6) PDF copies of up to three publications; and (7) the names, affiliations, and email addresses of three professional references. Rice University, and the Department of Physics and Astronomy, are strongly committed to a culturally diverse intellectual community. In this spirit, we particularly welcome applications from all genders and members of historically underrepresented groups who exemplify diverse cultural experiences and who are especially qualified to mentor and advise all members of our diverse student population. We will begin reviewing applications by November 15, 2022. To receive full consideration, all application materials must be received by January 1, 2023. The expected appointment date is July, 2023.

Rice University is an Equal Opportunity Employer with commitment to diversity at all levels, and considers for employment qualified applicants without regard to race, color, religion, age, sex, sexual orientation, gender identity, national or ethnic origin, genetic information, disability or protected veteran status.

Wednesday, September 28, 2022

News items, Nobel speculation

 Some news items of interest:

  • Three weeks old now, but this story about IBM cooling down their enormous dilution refrigerator setup got my attention (as someone with ultralow temperature scientific roots).  IBM did this to demonstrate that this kind of large-scale cooling is possible, since it may be necessary for some implementations (whether superconducting or spin-based) of quantum computing.  To give a sense of scale, Oxford Instruments used to rate their dilution refrigerators based on their cooling power at 100 mK (how much heat could you dump into the system and still have it maintain a steady 100 mK temperature).  The system I worked on in grad school was pretty large, a model 400, meaning it had 400 microwatts of cooling power at 100 mK.  The new IBM setup can handle six dilution refrigerator units with a total cooling power of 10 mW (25 times more cooling capacity) at 100 mK, and with plenty of room for lots of electronic hardware.  Dil fridges are somewhat miraculous, in that they give access to temperatures far below what is readily available in nature thanks to the peculiarities of the 3He/4He mixture phase diagram. 
  • This retraction and the related news article are quite noteworthy.  The claim of room temperature superconductivity in a carbon-containing, hydrogen-rich material at very high pressures (written about here) has been retracted by the editors of Nature over the objection of the authors.    The big issue, as pointed out by Hirsch and van der Marel, is about the magnetic susceptibility data, the subtraction of a temperature-dependent background, and concerns about whether this was done incorrectly (or misleadingly/fraudulently).  
  • Can we all agree, after looking at images like the one in this release, that STM and CO-functionalized-tip AFM are truly amazing techniques that show molecules really do look like chemistry structural diagrams from high school?
  • Quanta magazine has a characteristically excellent article about patterns arising when curved elastic surfaces are squished flat.  
  • They also have an article about this nice experiment (which I have not read in detail).  I need to look at this further, but it's a safe bet to say that many will disagree with the claim (implied by the article headline) that this has now solved the high temperature superconductivity problem to completion.  
And it's that time of year again to speculate about the Nobel prizes.  It would seem that condensed matter is probably due, given the cyclic history of the physics prize.  There are many candidates that have been put forward in previous years (topological insulators; metamaterials; quantum cascade lasers; twisted materials; multiferroics; anyons; my always-wrong favorite of geometric phases) as well as quantum optics (Bell inequalities).  I suspect the odds-on favorite for the medicine prize would be mRNA-based vaccines, but I don't know the field at all.  Feel free to gossip in the comments.

Friday, September 16, 2022

Surprising spin transport in insulating VO2

Monoclinic VO2, adapted from here

As I wrote last year, techniques have been developed in the last decade or two that use the inverse spin Hall effect as a tool for measuring the transport of angular momentum in insulators.  We just applied this approach to look at the thermally driven transport of spin-carrying excitations (the spin Seebeck effect) in thin films of vanadium dioxide at low temperatures.  VO2 is a strongly correlated transition metal oxide that has a transition at 65 °C between a high-temperature (rutile structure) metallic state with 1D vanadium chains, and a low-temperature (monoclinic structure) insulating state in which the vanadium atoms have formed dimers, as shown at right.  I circled one V-V dimer in purple.  

The expectation, going back almost 50 years, is that in each dimer the unpaired d electrons on the vanadium atoms form a singlet, and thus the insulating state should be magnetically very boring.  That's why the results of our recently published paper are surprising.  In the "nonlocal" geometry (with a heater wire and a detector wire separated laterally on the surface of a VO2 film), we see a clear spin Seebeck signal that increases at temperatures below 30-40 K, indicating that some spin-carrying excitations are being thermally excited and diffusing from the heater to the detector.   One natural suspect would be thermally activated triplet excitations called triplons, and we are continuing to take data in other geometries and to try to nail down whether that is what is happening here.  

This has been a fun project, in part because I get a real kick out of the fact that this measurement technique is so simple and first-year undergrad physics says that you should see nothing.  We are running ac current back and forth in one wire at a few Hz, and measuring the voltage across a neighboring wire at twice that frequency, on an insulating substrate.  Instead of seeing nothing, because of the hidden action of spin in the insulator and spin-orbit physics in the wires, we see a clear signal that depends nontrivially on magnetic field magnitude and direction as well as temperature.  Gets me every time.  
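If the second-harmonic business sounds mysterious, here is a minimal sketch of the idea (the numbers are made up, and this is not our actual analysis code): Joule heating scales as \(I^{2}\), so driving the heater at frequency \(f\) modulates the temperature gradient, and hence the detected voltage, at \(2f\), which a lock-in picks out.

```python
# Toy demonstration that Joule heating (~ I^2) of a heater wire driven at f
# produces a detector response at 2f, which a lock-in picks out.  All numbers
# are invented for the illustration.
import numpy as np

rng = np.random.default_rng(0)
f = 7.0                       # heater drive frequency in Hz (assumed)
fs = 10000.0                  # sampling rate in Hz
t = np.arange(0.0, 10.0, 1.0 / fs)
I0 = 1e-3                     # heater current amplitude in A (assumed)

I_heater = I0 * np.sin(2 * np.pi * f * t)
# detector voltage: a thermal (spin Seebeck-like) term that follows the heating
# power ~ I^2, plus some ordinary 1f pickup and white noise for realism
V_det = 0.2 * I_heater**2 + 5e-8 * np.sin(2 * np.pi * f * t) + 1e-8 * rng.standard_normal(t.size)

def lockin(signal, freq):
    # digital lock-in: demodulate at freq and average over the full record
    X = 2 * np.mean(signal * np.cos(2 * np.pi * freq * t))
    Y = 2 * np.mean(signal * np.sin(2 * np.pi * freq * t))
    return X, Y

print("1f quadratures (V):", lockin(V_det, f))
print("2f quadratures (V):", lockin(V_det, 2 * f))
# the 2f in-phase component has magnitude ~ 0.2 * I0**2 / 2 = 1e-7 V:
# the thermally driven (I^2) response shows up at twice the drive frequency
```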

Monday, September 05, 2022

Coming next month....


I'm going to be presenting a continuing education course starting next month, trying to give a general audience introduction to some key ideas about condensed matter and materials physics.   

From the intro flyer:  "The world and the materials that compose it are full of profound phenomena often overlooked by the uninitiated, from quasiparticles to the quantum world. Did you know that there are hundreds of states of matter? Have you ever wondered why objects can’t pass through each other and why stars don’t collapse? What do sports fans doing the wave or a traffic slowdown on the 610 Loop have to do with electrical conduction in metal? Why are raindrops wet and how do snowflakes achieve their delicate sixfold symmetry? Learn how physics affects everything around you, defining the very laws of nature. Spanning physics, chemistry, materials science, electrical engineering and even a bit of biology, this course brings the foundations of everyday physics to life and shares some of the most intriguing research emerging today."

Here is the link for registration for the course.  (The QR code I'd originally posted seems to point to the wrong class.)

(My posting has been less frequent as I continue to work on preparing this class.  Should be fun.)
  

Tuesday, August 23, 2022

A couple of quantum info papers

Surfacing from being submerged by other responsibilities (including jury duty today), I wanted to point out two papers.

This preprint (based on a talk at the Solvay Conference - the 2022 one, not the 1927 one) from John Preskill provides a nice overview of quantum information, at a very accessible level for non-experts.  It’s nice seeing this perspective.

This paper is much more technical and nitty-gritty.  I’ve written before about my curiosity concerning how good the performance of quantum computing devices would be if they were made with the process control and precision of modern state-of-the-art semiconductor fab.  Here is a look at superconducting qubits made with a CMOS-compatible process, using e-beam lithography and other steps developed for 300 mm wafer fabrication.  

More soon….


Saturday, August 06, 2022

Brief items - talks, CHIPS, and a little reading

An administrative role that I've taken on for the last month, and that will run through September, has been eating quite a bit of my time, but I wanted to point out some interesting items:

Sunday, July 31, 2022

Indistinguishability

In thinking about presenting physics to a lay audience, I think we haven't sufficiently emphasized what we mean by particles or objects being "indistinguishable" and everything that idea touches in modern physics.

The vernacular meaning of "indistinguishable" is clear.  Two objects are indistinguishable if, when examined, they are identical in every way - there is no measurement or test you could do that would allow you to tell the difference between them.  Large objects can only approach this.  Two identical cue balls made at a billiard ball factory might be extremely similar (white, spherical, smooth, very close to the same diameter), but with sufficient tools you could find differences that would allow you to label the balls and tell them apart.  In the limit of small systems, though, as far as we know it is possible to have objects that are truly indistinguishable.  Two hydrogen atoms, for example, are believed to be truly indistinguishable, with exactly the same mass, exactly the same optical and magnetic properties, etc.  

This idea of indistinguishable particles has profound consequences.  In statistical mechanics, entropy is commonly given by \(k_{\mathrm{B}}\ln \Omega\), where \(k_{\mathrm{B}}\) is Boltzmann's constant, and \(\Omega\) is the number of microscopic ways to arrange a system.  If particles are indistinguishable, this greatly affects our counting of configurations.  (A classical example of this issue involves mixing entropy and the Gibbs Paradox.)  
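As a concrete, very idealized illustration (the particle and state numbers below are arbitrary choices for the example), here is a toy counting exercise in the dilute limit.  Removing a partition between two boxes of the same gas should not change the entropy, and that only works out if we divide \(\Omega\) by \(N!\):

```python
# Toy counting exercise in the dilute limit (numbers are arbitrary choices):
# "mixing" two boxes of the same ideal gas should not change the entropy,
# and that only works if the number of microstates is divided by N!.
from math import log, lgamma

def ln_omega(N, M, indistinguishable=True):
    # N particles distributed over M >> N single-particle states
    ln_w = N * log(M)
    if indistinguishable:
        ln_w -= lgamma(N + 1)      # subtract ln(N!)
    return ln_w

N, M = 10**4, 10**8    # dilute: far more states than particles

for label, indist in [("distinguishable", False), ("indistinguishable", True)]:
    S_before = 2 * ln_omega(N, M, indist)        # two separate, identical boxes
    S_after = ln_omega(2 * N, 2 * M, indist)     # partition removed
    print(f"{label:17s}: Delta S / k_B = {S_after - S_before:.1f}")
# distinguishable counting gives Delta S ~ 2 N ln 2 (the Gibbs paradox);
# the N! correction leaves only a tiny sub-extensive remainder, essentially zero
```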

Indistinguishability and quantum processes deeply bothered the founders of quantum theory.  For example, take a bunch of hydrogen atoms all in the 2p excited state.  Any measurement you could do on those atoms would show them to have exactly the same properties, without any way to distinguish the first atom from, say, the fifth atom selected.  Left alone, each of those atoms will decay to the 1s ground state and spit out a photon, but they will do so in a random order at random times, as far as we can tell.  We can talk about the average lifetime of the 2p state, but there doesn't seem to be any way to identify some internal clock that tells each excited atom when to decay, even though the atoms are indistinguishable. 

It gets more unsettling.  Any two electrons are indistinguishable.  So, "common sense" says that swapping any two electrons should get you back to a state that is the same as the initial situation.  However, electrons are fermions and follow Fermi-Dirac statistics.  When swapping any two electrons, the quantum mechanical state, the mathematical object describing the whole system, has to pick up a factor of -1.   Even weirder, there can be interacting many-body situations when swapping nominally indistinguishable particles takes the system to an entirely different state (non-Abelian anyons).  The indistinguishability of electrons has prompted radical ideas in the past, like Wheeler suggesting that there really is only one electron.
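To make the factor of -1 concrete, the standard textbook construction (not specific to this post) for two electrons occupying single-particle states \(\varphi_{a}\) and \(\varphi_{b}\) (suppressing spin) is \(\psi(\mathbf{r}_{1}, \mathbf{r}_{2}) = \left[\varphi_{a}(\mathbf{r}_{1})\varphi_{b}(\mathbf{r}_{2}) - \varphi_{b}(\mathbf{r}_{1})\varphi_{a}(\mathbf{r}_{2})\right]/\sqrt{2}\).  Swapping \(\mathbf{r}_{1} \leftrightarrow \mathbf{r}_{2}\) gives back \(-\psi\), and setting \(a = b\) makes \(\psi\) vanish identically, which is the Pauli exclusion principle.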

TL/DR:  Indistinguishability of particles is intuitively weird, especially in the context of quantum mechanics.

Wednesday, July 13, 2022

A (very late) book review: The Cold Wars

On a recommendation, I just finished reading The Cold Wars:  A History of Superconductivity, by Jean Matricon and Georges Waysand.  This book was originally published in French in 1994 and then translated into English in 2003.  Unfortunately, the book appears to be out of print, and I was fortunate enough to pick up a used hardcover copy.

Beginning with the race to achieve very low temperatures through liquefaction of helium, this work tells a pretty compelling story of the discovery and understanding of superconductivity and superfluidity, ending in the comparatively early days of the high Tc cuprates.  Along the way, the authors introduce a cast of varyingly compelling characters and some fascinating stories that aren't all well-known.  I knew that Landau had been sent to a gulag for a time; I did not know that Kapitsa wrote personally to Stalin to try to argue that this had to be a mistake and was probably a consequence of Landau being very abrasive to the wrong people more than any actual counterrevolutionary beliefs.  (Spoiler:  Landau was abrasive, but he did also sign onto a letter that slammed Stalin hard.)  I did not know that Bardeen and Schrieffer were most concerned about Feynman possibly scooping them thanks to the latter's brilliance and expertise in both superfluidity and diagrammatic methods.  The story is also in there about the initial papers on YBCO and how the chemical formula was "accidentally" wrong in the submitted manuscript, corrected only once the paper was in proofs.

The authors also do a good job of conveying the ebb and flow of science - from the sudden onset of a fashionable topic, to the transition toward more detail and a greater focus on applications.  The social dimensions come through as well, with the coming and going of particular great centers of excellence in the research, and approaches to "big" science.    

The style is accessible, and throughout there are indented sidebars meant to provide scientific context for those readers who are probably not from a physics background.  If you're interested in the topic and the history of this part of condensed matter physics, I definitely recommend tracking down a copy and reading it.

Friday, July 08, 2022

More about the costs of doing research

This isn't physics, but it's still something that might be of interest to some readers.  There is still a great deal of mystery among many about how university research is funded and supported.  Five years ago I wrote a bit about "indirect costs", more properly called facilities and administrative costs (F&A).  I recently had the opportunity to learn more about this, and I came across a very helpful document from the Council on Governmental Relations.  COGR is an association of organizations (universities, med centers, research institutes) that help advocate to government policymakers about research administration and finances.

The document explains a lot about the history of how research finances are handled in the US.  One thing that struck me as I was reading it is that the actual administrative costs that can be charged to grants (which pay for things like running the grants and contracts offices and the accountants who track the money) have been capped (only for universities) at 26% since 1991, even though there have been more and more reporting and administrative requirements placed on universities ever since.   (If you want to know what happened in 1991 that led to this cap, you can read this [that might be paywalled] or this wiki article whose detailed accuracy I cannot confirm.)

As I wrote before, vice provosts/presidents/chancellors for research at major US universities would be happy to explain at length that F&A cost recovery doesn't come close to covering the actual costs associated with doing university-based research.  ("Universities lose money doing research.")  Obviously this is an oversimplification - if research was truly a large net financial negative, universities wouldn't do it.  Successful research universities accrue benefits from research in terms of stature and reputation that are worth enough to make the enterprise worthwhile.  Of course, the danger is that the balance will shift enough that only the wealthiest, most prestigious universities will be able to afford groundbreaking research in expensive fields (typically the physical sciences and engineering).  

Anyway, if you want to understand the issues better, I encourage reading that document.  I'll write more about physics soon.

Friday, June 24, 2022

Implementing a model of polyacetylene

An impressive paper was just published in Nature, in which atomically precise structures fabricated in Si were used as an analog model of a very famous problem in physics, the topological transition in trans-polyacetylene. 

Actual trans-polyacetylene is a conjugated organic chain molecule, consisting of sp2-hybridized carbons, as shown.  This is an interesting system, because you could imagine swapping the C-C and C=C bonds, and having domains where the (bottom-left to top-right) links are double bonds, and other domains where the (top-left to bottom-right) links are double bonds.  The boundaries between domains are topological defects ("solitons").  As was shown by Su, Schrieffer, and Heeger, these defects are spread out over a few bonds, are energetically cheap to form, and are mobile.  

(Adapted from Fig 1 here)
The Su-Schrieffer-Heeger model is a famous example of a model that shows a topological transition.  Label site-to-site hopping along those two bond directions as \(v\) and \(w\).  If you have a finite chain, as shown here, and \(v > w\), there are no special states at the ends of the chain.  However, if \(v < w\) for the system as shown, it is favorable to nucleate two "surface states" at the chain ends, with the topological transition happening at \(v = w\).  
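This is easy to check numerically.  Here is a minimal sketch (not related to the paper's analysis; the chain length and hopping values are arbitrary choices) that builds a finite SSH chain and looks for midgap end states:

```python
# Minimal numerical check of the SSH edge-state story: build a finite open
# chain with alternating hoppings v (the bond at the chain ends) and w,
# diagonalize, and count states in the middle of the gap.
import numpy as np

def ssh_spectrum(v, w, n_cells=40):
    n_sites = 2 * n_cells
    H = np.zeros((n_sites, n_sites))
    for i in range(n_sites - 1):
        t = v if i % 2 == 0 else w          # bonds alternate v, w, v, w, ...
        H[i, i + 1] = H[i + 1, i] = -t
    return np.linalg.eigvalsh(H)

for v, w in [(1.0, 0.6), (0.6, 1.0)]:
    E = ssh_spectrum(v, w)
    n_midgap = int(np.sum(np.abs(E) < 0.5 * abs(v - w)))
    print(f"v = {v}, w = {w}: {n_midgap} state(s) near E = 0")
# expect no midgap states for v > w, and two (the end states) for v < w
```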

The new paper that's just been published takes advantage of the technical capabilities developed over the last two decades by the team of Michelle Simmons at UNSW.  I have written about this approach here.  They have developed and refined the ability to place individual phosphorus dopant atoms on Si with near-atomic precision, allowing them to fabricate "dots" (doped islands) and gate electrodes, and then wire these up and characterize them electrically.  The authors made two devices, each a chain of islands analogous to the C atoms, and most importantly were able to use gate electrodes to tune the charge population on the islands.  One device was designed to be in the topologically trivial limit, and the other (when population-tuned) in the limit with topological end states.  Using electronic transport, they could perform spectroscopy and confirm that the energy level structure agrees with expectations for these two cases.

(Adapted from Fig 2 here)

This is quite a technical accomplishment.  Sure, we "knew" what should happen, but the level of control demonstrated in the fabrication and measurement is very impressive.  These results bode well for the future of using these tools to implement analog quantum simulators for more complicated, much harder to solve many-body systems.  

Sunday, June 12, 2022

Quasiparticles and what is "real"

This week a paper was published in Nature about the observation, via Raman scattering, of a particular excitation in the charge density wave materials RTe3 (R = La, Gd): mathematically, an example of an "amplitude mode" carrying angular momentum, which the authors identify as an axial Higgs mode.  (I'm not going to get into the detailed physics of this.)

The coverage of this paper elicited a kerfuffle on blogs (e.g. here and here) for two main reasons that I can discern.  First, there is disagreement in the community about whether calling a mode like this "Higgs" is appropriate, given the lack of a gauge field in this system (this is in the comments on the second blog posting).  That usage has become common practice in the literature, but there are those who strongly disapprove.  Second, some people are upset because some of the press coverage of the paper, with references to dark matter, hyped up the result to make it sound like this was a particle physics discovery, or at least has implications for particle physics. 

This does give me the opportunity, though, to talk about an implication that I sometimes see from our high energy colleagues in discussions of condensed matter: that "quasiparticles" are somehow not "real" in the way that elementary particles are.  

What are quasiparticles?  In systems with many degrees of freedom built out of large numbers of constituents, amazingly it is often possible to look at the low energy excitations above the ground state and find that those excitations look particle-like - that is, there are discrete excitations that, e.g., carry (crystal) momentum \(\hbar \mathbf{k}\), have an energy that depends on the momentum in a clear way \(\epsilon(\mathbf{k})\), and also carry spin, charge, etc.  These excitations are "long lived" in the sense that they propagate many of their wavelengths (\(2 \pi/|\mathbf{k}|\)) before scattering and have lifetimes \(\tau\) such that their uncertainty in energy is small compared to their energy above the ground state (\(\hbar/\tau \ll \epsilon(\mathbf{k})\)).  The energy of the many-body system can be well approximated as the sum of the quasiparticle excitations:  \(E \approx \Sigma n(\mathbf{k})\epsilon(\mathbf{k})\).  

There are many kinds of quasiparticles in condensed matter systems.  There are the basic ones like (quasi)electrons and (quasi)holes in metals and semiconductors, phonons, magnons, polarons, plasmons, etc.  While it is true that quasiparticles are inherently tied to their host medium, these excitations are "real" in all practical ways - they can be detected experimentally and their properties measured.  Indeed, I would argue that it's pretty incredible that complicated, many-body interacting systems so often host excitations that look so particle-like.  That doesn't seem at all obvious to me a priori.  

What has also become clear over the last couple of decades is that condensed matter systems can (at least in principle) play host to quasiparticles that act mathematically like a variety of ideas that have been proposed over the years in the particle physics world.  You want quasiparticles that mathematically look like massless fermions described by the Dirac equation?  Graphene can do that.  You want more exotic quasiparticles described by the Weyl equation?  TaAs can do that.  You want Majorana fermions?  These are expected to be possible, though challenging to distinguish unambiguously.  Remember, the Higgs mechanism started out in superconductors, and the fractional quantum Hall system supports fractionally charged quasiparticles.  (For a while it seemed like there was a cottage industry on the part of a couple of teams out there:  Identify a weird dispersion relation \(\epsilon(\mathbf{k})\) predicted in some other context; find a candidate material whose quasiparticles might show this according to modeling; take ARPES data and publish on the cover of a glossy journal.)
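The graphene case is simple enough to check in a few lines.  Here is a quick sketch of the standard nearest-neighbor tight-binding dispersion (textbook material, not tied to any particular paper; the hopping value is the commonly quoted ~2.8 eV), showing that the band energy vanishes at the K point and grows linearly with distance from it:

```python
# Standard nearest-neighbor tight-binding dispersion of graphene.  The band
# energy vanishes at the K point and grows linearly with the distance q away
# from it: a massless, Dirac-like spectrum.
import numpy as np

t = 2.8                                       # hopping in eV (commonly quoted)
a = 1.0                                       # C-C distance (sets the length unit)
a1 = np.array([1.5, np.sqrt(3) / 2]) * a      # Bravais lattice vectors
a2 = np.array([1.5, -np.sqrt(3) / 2]) * a

def band_energy(k):
    f = 1 + np.exp(1j * (k @ a1)) + np.exp(1j * (k @ a2))
    return t * abs(f)        # conduction band; the valence band is its negative

K = np.array([2 * np.pi / (3 * a), 2 * np.pi / (3 * np.sqrt(3) * a)])  # Dirac point

for q in [0.0, 0.01, 0.02, 0.04]:
    E = band_energy(K + np.array([q, 0.0]))
    slope = "" if q == 0 else f"   E/q = {E / q:.2f} eV*a"
    print(f"q = {q:.2f}/a : E = {E:.4f} eV{slope}")
# E/q comes out nearly constant (~ 3ta/2 = 4.2 eV*a): a linear dispersion
```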

Why are quasiparticles present in condensed matter, and why do they "look like" some models of elementary particles?  Fundamentally, both crystalline solids and free space can be usefully described using the language of quantum field theory.  Crystalline solids have lower symmetry than free space (e.g. the lattice gives discrete rather than continuous translational symmetry), but the mathematical tools at work are closely related.  As Bob Laughlin pointed out in his book, given that quasiparticles in condensed matter can be described in very particle-like terms and can even show fractional charge, maybe it's worth wondering whether everything is in a sense quasiparticles.  


Saturday, May 28, 2022

Brief items - reviews, videos, history

Here are some links from the past week:

  • I spent a big portion of this week attending Spin Caloritronics XI at scenic UIUC, for my first in-person workshop in three years.  (The APS March Meeting this year was my first conference since 2019.)  It was fun and a great way to get to meet and hear from experts in a field where I'm a relative newbie.  While zoom and recorded talks have many upsides, the in-person environment is still tough to beat when the meeting is not too huge.  
  • Topical to the meeting, this review came out on the arxiv this week, all about the spin Seebeck effect and how the thermally driven transport of angular momentum in magnetic insulators can give insights into all sorts of systems, including ones with exotic spin-carrying excitations.
  • Another article on a topic near to my heart is this new review (to appear in Science) about strange metals.  It makes clear the distinction between strange and bad metals and gives a good sense of why these systems are interesting.
  • On to videos.  While at the meeting, Fahad Mahmood introduced me to this outreach video, by and about women in condensed matter at UIUC.
  • On a completely unrelated note, I came across this short film from 1937 explaining how differential steering works in cars.  This video is apparently well known in car enthusiast circles, but it was new to me, and its clarity was impressive.  
  • Finally, here is the recording of the science communication symposium that I'd mentioned.  The keynote talk about covid by Peter Hotez starts at 1h49m, and it's really good. 
  • In terms of history (albeit not condensed matter), this article (written by the founding chair) describes the establishment of the first Space Science department anywhere, at Rice University.  In 1999 the SPAC department merged with Physics to become the Department of Physics and Astronomy, where I've been since 2000.  

Sunday, May 15, 2022

Flat bands: Why you might care, and one way to get them

When physicists talk about the electronic properties of solids, we often talk about "band theory".  I've written a bit about this before here.  In classical mechanics, a free particle of mass \(m\) and momentum \(\mathbf{p}\) has a kinetic energy given by \(p^2/2m\).  In a crystalline solid, we can define a parameter, the crystal momentum, \(\hbar \mathbf{k}\), that acts a lot like momentum (accounting for the ability to transfer momentum to and from the whole lattice).  The energy near the top or bottom of a band is often described by an effective mass \(m_{*}\), so that \(E(\mathbf{k}) = E_{0} + (\hbar^2 k^2/2m_{*})\).  The whole energy band spans some range of energies called the bandwidth, \(\Delta\). If a band is "flat", that means that its energy is independent of \(\mathbf{k}\) and \(\Delta = 0\).  In the language above, that would imply an infinite effective mass; in a semiclassical picture, that implies zero velocity - the electrons are "localized", stuck around particular spatial locations.  

Why is this an interesting situation?  Well, the typical band picture basically ignores electron-electron interactions - the assumption is that the interaction energy scale is small compared to \(\Delta\).  If there is a flat band, then interactions can become the dominant physics, leading potentially to all kinds of interesting physics, like magnetism, superconductivity, etc.  There has been enormous excitement in the last few years about this because twisting adjacent layers of atomically thin materials like graphene by the right amount can lead to flat bands and does go along with a ton of cool phenomena.  

How else can you get a flat band?  Quantum interference is one way.  When worrying about quantum interference in electron motion, you have to add the complex amplitudes for different electronic trajectories.  This is what gives you the interference pattern in the two-slit experiment.   When trajectories to a certain position interfere destructively, the electron can't end up there.  

It turns out that destructive interference can come about from lattice symmetry. Shown in the figure is a panel adapted from this paper, a snapshot of part of a 2D kagome lattice.  You can think of the labeled hexagon of atoms there rather like the ring of carbon atoms in benzene, and it turns out that there are states such that the electrons tend to be localized to that hexagon.  Within a Wannier framework, the amplitudes for an electron to hop from the + and - labeled sites to the nearest (red) site are equal in magnitude but opposite in sign.  So, hopping out of the hexagon does not happen, due to destructive interference of the two trajectories (one from the + site, and one from the - site).  
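That destructive interference shows up directly in the band structure.  Here is a quick numerical check of the standard nearest-neighbor kagome tight-binding model (a textbook calculation, not taken from the paper above), which has one band that comes out perfectly flat:

```python
# Quick check of the nearest-neighbor kagome tight-binding model: one of the
# three bands comes out perfectly flat, pinned at E = +2t for every k.
import numpy as np

t = 1.0
a1 = np.array([1.0, 0.0])                 # Bravais lattice vectors
a2 = np.array([0.5, np.sqrt(3) / 2])

def kagome_bands(k):
    # three sublattices per unit cell; nearest-neighbor Bloch Hamiltonian
    c1 = np.cos(k @ a1 / 2)
    c2 = np.cos(k @ a2 / 2)
    c3 = np.cos(k @ (a2 - a1) / 2)
    H = -2 * t * np.array([[0, c1, c2],
                           [c1, 0, c3],
                           [c2, c3, 0]])
    return np.linalg.eigvalsh(H)          # sorted ascending

rng = np.random.default_rng(1)
ks = rng.uniform(-2 * np.pi, 2 * np.pi, size=(500, 2))
bands = np.array([kagome_bands(k) for k in ks])

for n in range(3):
    print(f"band {n}: min = {bands[:, n].min():+.3f} t, max = {bands[:, n].max():+.3f} t")
# the top band shows min = max = +2.000 t: flat, thanks to exactly the kind of
# destructive interference described above
```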

Of course, if the flat band is empty, or if the flat band is buried deep down among the completely occupied electronic states, that's not likely to have readily observable consequences.  The situation is much more interesting if the flat band is near the Fermi level, the border between filled and empty electronic states.  Happily, this does seem to happen - one example is Ni3In, as discussed here showing "strange metal" response; another example is the (semiconducting?) system Nb3Cl8, described here.  These flat bands are one reason why there is a lot of interest these days in "kagome metals".

Saturday, May 14, 2022

Grad students mentoring grad students - best practices?

I'm working on a physics post about flat bands, but in the meantime I thought I would appeal to the greater community.  Our physics and astronomy graduate student association is spinning up a mentoring program, wherein senior grad students will mentor beginning grad students.  It would be interesting to get a sense of best practices in this.  Do any readers have recommendations for resources about this kind of mentoring, or examples of departments that do this particularly well?  I'm aware of the program at UCI and the one at WUSTL, for example.

Sunday, May 01, 2022

The multiverse, everywhere, all at once

The multiverse (in a cartoonish version of the many-worlds interpretation of quantum mechanics sense - see here for a more in-depth writeup) is having a really good year.  There are all the Marvel properties (Spider-Man: No Way Home; Loki, with its Time Variance Authority; and this week's debut of Doctor Strange in the Multiverse of Madness), and the absolutely wonderful film Everything Everywhere All at Once, which I wholeheartedly recommend.  

While it's fun to imagine alternate timelines, the actual many-worlds interpretation of quantum mechanics (MWI) is considerably more complicated than that, as outlined in the wiki link above.  The basic idea is that the apparent "collapse of the wavefunction" upon a measurement is a misleading way to think about quantum mechanics.  Prepare an electron so that its spin is aligned along the \(+x\) direction, and then measure \(s_{z}\).  The Copenhagen interpretation of quantum mechanics would say that prior to the measurement, the spin is in a superposition of \(s_{z} = +1/2\) and \(s_{z}=-1/2\), with equal amplitudes.  Once the measurement is completed, the system (discontinuously) ends up in a definite state of \(s_{z}\), either up or down.  If you started with an ensemble of identically prepared systems, you'd find up or down with 50/50 probability once you looked at the measurement results.    

The MWI assumes that all time evolution of quantum systems is (in the non-relativistic limit) governed by the Schrödinger equation, period.  There is no sudden discontinuity in the time evolution of a quantum system due to measurement.  Rather, at times after the measurement, the spin up and spin down results both occur, and there are observers who (measured spin up, and \(s_{z}\) is now +1/2) and observers who (measured spin down, and \(s_{z}\) is now -1/2).  Voila, we no longer have to think about any discontinuous time evolution of a quantum state; of course, we have the small issues that (1) the universe becomes truly enormously huge, since it would have to encompass this idea that all these different branches/terms in the universal superposition "exist", and (2) there is apparently no way to tell experimentally whether that is actually the case, or whether it is just a way to think about things that makes some people feel more comfortable.  (Note, too, that exactly how the Born rule for probabilities arises and what it means in the MWI is not simple.) 
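As a cartoon of "unitary evolution only", here is a minimal numerical sketch - my own toy model, with an idealized two-state pointer standing in for the measuring apparatus, not anyone's actual decoherence calculation:

```python
import numpy as np

# Toy sketch: treat the "measurement" as a unitary that entangles the spin
# with a two-state pointer.  Nothing ever collapses; both outcomes persist
# as branches of one big superposition, each with Born weight 1/2.

up, down = np.array([1.0, 0.0]), np.array([0.0, 1.0])
spin = (up + down) / np.sqrt(2)        # prepared along +x
pointer = up.copy()                    # pointer starts in its "ready" state

# Interaction unitary: flip the pointer if and only if the spin is down
# (a CNOT in the basis |spin, pointer>).
U = np.array([[1, 0, 0, 0],
              [0, 1, 0, 0],
              [0, 0, 0, 1],
              [0, 0, 1, 0]], dtype=float)

state = U @ np.kron(spin, pointer)
print(np.round(state, 3))              # [0.707 0.    0.    0.707]
# i.e. (|up, ready> + |down, flipped>)/sqrt(2): two branches, no collapse.

branch_weights = np.abs(state.reshape(2, 2)) ** 2
print(branch_weights.sum(axis=1))      # [0.5 0.5], the Born-rule 50/50
```

The point of the toy: the final state still contains both outcomes, each branch carrying weight 1/2; the "collapse" has been replaced by entanglement between system and apparatus.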

I'm not overly fond of the cartoony version of MWI.  As mentioned in point (2), there doesn't seem to be an experimental way to distinguish MWI from many other interpretations anyway, so maybe I shouldn't care.  I like Zurek's ideas quite a bit, but I freely admit that I have not had time to sit down and think deeply about this (I'm not alone in that).  That being said, lately I've been idly wondering whether the objection to the "truly enormously huge" MWI multiverse is well-founded beyond an emotional level.  As a modern physicist, I have already come to accept (because of observational evidence) that the universe is huge, possibly infinite in spatial extent, appears to have erupted into an inflationary phase about 13.8 billion years ago from an incredibly dense starting point, and contains incredibly rich structure even though ordinary matter represents only about 5% of the total mass-energy of everything, etc.  I've also come to accept that quantum mechanics makes decidedly unintuitive predictions about reality that are borne out by experiment.  Maybe I should get over being squeamish about the MWI need for a zillion-dimensional Hilbert space multiverse.  As xkcd once said, the Drake Equation should include a factor for "amount of bullshit you're willing to buy from Frank Drake".  Why should MWI's overhead be a bridge too far?  

It's certainly fun to speculate idly about roads not taken.  I recommend this thought-provoking short story by Larry Niven on the subject, which struck my physics imagination back when I was in high school.  Perhaps there's a branch of the multiverse where my readership is vast :-)



Monday, April 25, 2022

Science Communications Symposium

 I will be posting more about science very soon, but today I'm participating in a science communications symposium here in the Wiess School of Natural Sciences at Rice.  It's a lot of fun and it's great to hear from some amazing colleagues who do impressive work.   For example, Lesa Tran Lu and her work on the chemistry of cooking, Julian West and his compelling scientific story-telling, Scott Solomon and his writing about evolution, and Kirsten Siebach and her work on Mars rovers and geology.

(On a side note, I've now been blogging for almost 17 years - that makes me almost 119 blog-years old.)

UPDATE:  Here is a link to a video of the whole symposium.


Friday, April 08, 2022

Brief items

It's been a while since the APS meeting, with many things going on that have made catching up here a challenge.  Here are some recent items that I wanted to point out:

  • Igor Mazin had a very pointed letter to the editor in Nature last week, which is rather ironic since much of what he was excoriating is the scientific publishing culture promulgated by Nature.  His main point is that reaching for often-unjustified exotic explanations is rewarded by glossy journals - a kind of inverse Occam's Razor.   He also points out correctly that it's almost impossible for experimentalists to get a result published in a fancy journal without claiming some theoretical explanation.
  • We had a great physics colloquium here this week by Vincenzo Vitelli of the University of Chicago.  He spoke about a number of things, including "odd elasticity".  See, when relating stresses \(\sigma_{ij}\) to strains \(u_{kl}\), in ordinary elasticity there is a tensor that connects them: \(\sigma_{ij} = K_{ijkl} u_{kl}\), and that tensor is symmetric:  \(K_{ijkl} = K_{klij}\).  Vitelli and collaborators consider what happens when there are antisymmetric contributions to that tensor.  This means that a cycle of stress/strain ending back at the original material configuration could add or remove energy from the system, depending on the direction of the cycle (see the toy sketch after this list).  (Clearly this only makes sense in active matter, like driven or living systems.)  The results are pretty wild - see the videos about halfway down this page.
  • Here's something I didn't expect to see:  a new result out of the Tevatron at Fermilab, which is interesting since the Tevatron hasn't run since 2011.  Quanta has a nice write-up.  Basically, a new combined analysis of Fermilab data yields an updated estimate of the W boson mass, along with a claimed improved understanding of systematic errors and backgrounds.  The result is that the measured W mass exceeds the Standard Model expectation by an estimated 7 standard deviations.  The exotic explanation (perhaps favored by the inverse Occam's Razor above) is that the Standard Model calculation is off because it's missing contributions from so-far-undiscovered particles.  The less exotic explanation is that the new analysis and its small error estimates have some undiscovered flaw.  Time will tell - I gather that the LHC collaborations are working on their own measurements. 
  • This result is very impressive.  Princeton investigators have made qubits using spins of single electrons trapped in Si quantum dots, and they have achieved fidelity in 2-qubit operations greater than 99%.  If this is possible in (excellent) university-level fabrication, it does make you wonder whether great things may be possible in a scalable way with industrial-level process control.
  • This is a great interview with John Preskill.  In general the AIP oral history project is outstanding.
  • Well, this is certainly suggestive evidence that the universe really is a simulation.
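Regarding the odd elasticity item above, here is a toy numerical sketch - my own example with made-up stiffness numbers, not the Vitelli group's code - of the stress/strain cycle argument:

```python
import numpy as np

# Toy sketch: stress = K @ strain for two strain components, and the net work
# done on the material around a closed loop in strain space, W = \oint sigma_i du_i.
# For a counterclockwise circle of radius amp, W = pi * amp**2 * (K[1,0] - K[0,1]),
# which vanishes when K is symmetric.

def loop_work(K, amp=1e-3, n=4000, direction=+1):
    theta = direction * np.linspace(0.0, 2 * np.pi, n, endpoint=False)
    u = amp * np.stack([np.cos(theta), np.sin(theta)], axis=1)   # strain path
    du = np.roll(u, -1, axis=0) - u                              # increments along the loop
    u_mid = 0.5 * (u + np.roll(u, -1, axis=0))                   # midpoint of each segment
    sigma = u_mid @ K.T                                          # stress at each midpoint
    return float(np.sum(sigma * du))

K_sym = np.array([[2.0, 0.5],
                  [0.5, 1.0]])                   # ordinary (symmetric) elasticity
K_odd = K_sym + 0.3 * np.array([[0.0, 1.0],
                                [-1.0, 0.0]])    # add an antisymmetric, "odd" part

print(loop_work(K_sym))                 # ~0: a closed cycle does no net work
print(loop_work(K_odd))                 # ~ -1.9e-6: energy exchanged each cycle
print(loop_work(K_odd, direction=-1))   # sign flips when the loop runs the other way
```

For a symmetric \(K\), the stress is the gradient of an elastic energy, so the loop integral vanishes; the antisymmetric part has no such potential, and the net work per cycle is proportional to the loop area, with a sign set by the direction of traversal.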