## Thursday, February 13, 2020

### Film boiling and the Leidenfrost point

While setting up my eddy current bounce demonstration, I was able to film some other fun physics.

Heat transfer and two-phase (liquid+gas) fluid flow make for a complicated business that has occupied the time of many scientists and engineers for decades.  A liquid that is boiling at a given pressure is pinned to a particular temperature - that's the way the first-order liquid-vapor transition works.  Water at atmospheric pressure boils at 100 C; adding energy to the liquid water at 100 C via heat transfer converts water into vapor rather than increasing the temperature of the liquid.

Here we are using liquid nitrogen (LN2), which boils at 77 K = -196 C at atmospheric pressure, and are trying to cool a piece of copper plate that started out much warmer than that.  When the temperature difference between the copper and the LN2 is sufficiently high, there is a large heat flux that creates a layer of nitrogen vapor between the copper and the liquid.  This is called film boiling.   You've seen this in practice if you've ever put a droplet of water into a really hot skillet, or dumped some LN2 on the floor.  The droplet slides around with very low friction because it is supported by that vapor layer.

Once the temperature difference between the copper and the LN2 becomes small, the heat flux is no longer sufficient to support film boiling (the Leidenfrost point), and the vapor layer collapses - that brings more liquid into direct contact with the copper, leading to more vigorous boiling and agitation.  That happens at about 45 seconds into the video.  Then, once the copper is finally at the same temperature as the liquid, boiling ceases and everything gets calm.
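To get a rough feel for the energy budget in a cooldown like this, here's a back-of-the-envelope estimate.  The plate mass is a made-up illustrative number, and I'm using copper's room-temperature specific heat throughout; copper's specific heat falls considerably below ~100 K, so this overestimates the LN2 consumed:

```python
# Rough estimate: how much liquid nitrogen boils away while cooling a copper
# plate from room temperature to 77 K?  (Representative numbers, assumed.)
m_cu = 1.0          # mass of copper plate (kg) - assumed for illustration
c_cu = 385.0        # specific heat of copper near room T (J/(kg K));
                    # it drops a lot at low T, so this is an upper bound
L_n2 = 1.99e5       # latent heat of vaporization of nitrogen (J/kg)

dT = 300.0 - 77.0                   # temperature drop (K)
heat_removed = m_cu * c_cu * dT     # J, upper-bound estimate
m_ln2 = heat_removed / L_n2         # kg of LN2 boiled off

print(f"~{heat_removed/1e3:.0f} kJ removed, boiling off up to ~{m_ln2:.2f} kg of LN2")
```

So cooling each kilogram of copper boils away something like 0.4 kg of nitrogen, and all of that vapor has to go somewhere - hence the vigorous bubbling in the video.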

For a more technical discussion of this, see here.  It's written up on a site about nuclear power because water-based heat exchangers are a key component of multiple power generation technologies.

## Tuesday, February 11, 2020

### Eddy currents - bouncing a magnet in mid-air

Changing a magnetic field that permeates a conductor like a metal will generate eddy currents.  This is called induction, and it was discovered by Michael Faraday nearly 200 years ago.   If you move a ferromagnet near a conductor, the changing field produces eddy currents and those eddy currents create their own magnetic fields, exerting forces back on the magnet.  Here is a rather dramatic demo of this phenomenon, shamelessly stolen by me from my thesis adviser.

In the video, you can watch in slow motion as I drop a strong Nd2Fe14B magnet from about 15 cm above a 2 cm thick copper plate.  The plate is oxygen-free, high-purity copper, and it has been cooled to liquid nitrogen temperatures (77 K = -196 C).   That cooling suppresses lattice vibrations and increases the conductivity of the copper by around a factor of 20 compared with room temperature.  (If cooled to liquid helium temperatures, 4.2 K, the conductivity of this kind of copper goes up to something like 200 times its room temperature value, and is limited by residual scattering from crystalline grain boundaries and impurities.)

As the magnet falls, the magnetic flux $\Phi$ through the copper increases, generating a circumferential electromotive force and driving eddy currents.  Those eddy currents produce a magnetic field directed to repel the falling magnet.  The eddy currents grow until the upward force they produce brings the magnet to a halt about 2 cm above the copper (!).  At that instant, $d\Phi/dt = 0$, so the inductive EMF is zero.  However, the existing currents keep going because of the inductance of the copper.  (Treating the metal like an inductor-resistor circuit, the timescale for the current to decay is $L/R$, and $R$ is quite small.)  Those persisting currents generate magnetic fields that keep pushing up on the magnet, accelerating it upward.  The magnet bounces "in mid air".  Of course, the copper isn't a perfect conductor, so much of the energy is "lost" to resistively heating the copper, and the magnet gradually settles onto the plate.  If you try this at room temperature, the magnet clunks into the copper, because the copper's conductivity is worse and the eddy currents decay so rapidly that the repulsive force is insufficient to bounce the magnet before it hits the plate.
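To put a rough number on that $L/R$ timescale, here's a toy estimate that models the relevant eddy current path as a single-turn copper loop.  The loop radius and effective wire radius are made-up illustrative numbers (a real plate supports a continuum of current paths), and the factor of 20 in conductivity is the one quoted above:

```python
import math

# Toy L/R estimate for eddy current decay in copper.  The loop geometry is
# purely illustrative - a real plate is not a single wire loop.
mu0 = 4e-7 * math.pi
a = 0.05              # loop radius (m) - assumed
r = 0.005             # effective wire radius (m) - assumed
rho_300 = 1.7e-8      # copper resistivity at 300 K (ohm m)
rho_77 = rho_300 / 20 # ~20x more conductive at 77 K (see above)

L = mu0 * a * (math.log(8*a/r) - 2)            # self-inductance of a circular loop
def tau(rho):
    R = rho * (2*math.pi*a) / (math.pi*r**2)   # loop resistance
    return L / R                               # current decay timescale

print(f"tau(300 K) ~ {tau(rho_300)*1e3:.1f} ms, tau(77 K) ~ {tau(rho_77)*1e3:.0f} ms")
```

With these admittedly invented numbers, the decay time at 77 K comes out at tens of milliseconds - comparable to the timescale of the bounce itself - while at room temperature the currents die out an order of magnitude faster, consistent with the "clunk".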

(Later I'll make a follow-up post about other neat physics that happens while setting up this demo.)

## Sunday, February 09, 2020

### Grad school advice

I realized it's been several years since I've run a version of this, and it's the right season....

This is written on the assumption that you have already decided, after careful consideration, that you want to get an advanced degree (in physics, though much of this applies to any other science or engineering discipline).  This might mean that you are thinking about going into academia, or it might mean that you realize such a degree will help prepare you for a higher paying technical job outside academia.  Either way,  I'm not trying to argue the merits of a graduate degree.
• It's ok at the applicant stage not to know exactly what you want to do.  While some prospective grad students are completely sure of their interests, that's more the exception than the rule.  I do think it's good to have narrowed things down a bit, though.  If a school asks for your area of interest from among some palette of choices, try to pick one (rather than going with "undecided").  We all know that this represents a best estimate, not a rigid commitment.
• If you get the opportunity to visit a school, you should go.  A visit gives you a chance to see a place, get a subconscious sense of the environment (a "gut" reaction), and most importantly, an opportunity to talk to current graduate students.  Always talk to current graduate students if you get the chance - they're the ones who really know the score.  A professor should always be able to make their work sound interesting, but grad students can tell you what a place is really like.
• International students may find it very challenging to visit schools in the US, between the expense (many schools can help defray costs a little but cannot afford to pay airfare for trans-oceanic travel) and visa challenges.  Trying to arrange Skype discussions with people at the school is a possibility, but that can also be challenging.  I understand that this constraint tends to push international students toward making decisions based heavily on reputation rather than up-close information.
• Always go someplace where there is more than one faculty member with whom you might want to work.  Even if you are 100% certain that you want to work with Prof. Smith, and that the feeling is mutual, you never know what could happen, in terms of money, circumstances, etc.  Moreover, in grad school you will learn a lot from your fellow students and other faculty.  An institution with many interesting things happening will be a more stimulating intellectual environment, and that's not a small issue.
• You should not go to grad school because you're not sure what else to do with yourself.  You should not go into research if you will only be satisfied by a Nobel Prize.  In both of those cases, you are likely to be unhappy during grad school.
• I know grad student stipends are low, believe me.  However, it's a bad idea to make a grad school decision based purely on a financial difference of a few hundred or a thousand dollars a year.  Different places have vastly different costs of living - look into this.  Stanford's stipends are profoundly affected by the cost of housing near Palo Alto and are not an expression of generosity.  Pick a place for the right reasons.
• Likewise, while everyone wants a pleasant environment, picking a grad school largely based on the weather is silly.
• Pursue external fellowships if given the opportunity.  It's always nice to have your own money and not be tied strongly to the funding constraints of the faculty, if possible.  (It's been brought to my attention that at some public institutions the kind of health insurance you get can be complicated by such fellowships.  In general, I still think fellowships are very good if you can get them.)
• Be mindful of how departments and programs are run.  Is the program well organized?  What is a reasonable timetable for progress?  How are advisors selected, and when does that happen?  Who sets the stipends?  What are TA duties and expectations like?  Are there qualifying exams?  Where have graduates of that department gone after the degree?  Know what you're getting into!  Very often, information like this is available now in downloadable graduate program handbooks linked from program webpages.
• It's fine to try to communicate with professors at all stages of the process.  We'd much rather have you ask questions than the alternative.  If you don't get a quick response to an email, it's almost certainly due to busy-ness, and not a deeply meaningful decision by the faculty member.  For a sense of perspective:  even before I was chair, I would get 50+ emails per day of various kinds not counting all the obvious spam that gets filtered.
There is no question that far more information is now available to would-be graduate students than at any time in the past.  Use it.  Look at departmental web pages, look at individual faculty member web pages.  Make an informed decision.  Good luck!

## Wednesday, January 29, 2020

### Charles Lieber

As one of the few surviving nano-related blogs, I feel somewhat obligated to write a post about this.  Charles Lieber, chair of the department of chemistry and chemical biology at Harvard, was arrested yesterday by the FBI on charges of fraud.  Lieber is one of the premier nano researchers in the world.  The relevant documents are here (pdf) and they make for quite a read.

In brief, Lieber is alleged to have signed on to China's Thousand Talents program with an affiliation at Wuhan University of Technology back in 2011.  This involved setting up a joint research lab in Wuhan and regular interactions, including arranging for WUT students to come to Harvard.  That in itself is not necessarily problematic.  Much more concerning is the claim that WUT would pay \$50K/month (plus living expenses) for his involvement, and the stipulation in the agreement that he would be working at least nine months/yr with them.  That alone would raise serious conflict-of-commitment and percentage-effort issues.  Worse is the allegation that this went on for years, none of this was disclosed appropriately, and in fact was denied to both DOD and (via Harvard internal folks) NIH.

These allegations are shocking, and the story is hard to fathom for multiple reasons.

Putting on my department chair hat, I can't help but think about how absolutely disruptive this will be for his students and postdocs, since he was placed on immediate leave.  It will be a nontrivial task for the department and the Faculty of Arts and Sciences at Harvard to come up with a way to transition the students to other advising and pay circumstances, and even more challenging for the postdocs.  What a mess.

## Wednesday, January 22, 2020

### Stretchy bioelectronics and ptychographic imaging - two fun talks

One of the great things about a good university is the variety of excellent talks that you can see.

Yesterday we had our annual Chapman Lecture on Nanotechnology, in honor of Rice alum Richard Chapman, who turned down a first-round draft selection to the Detroit Lions to pursue a physics PhD and a career in engineering.  This year's speaker was Zhenan Bao from Stanford, whom I know from back in my Bell Labs postdoc days.  She spoke about her group's remarkable work on artificial skin:  biocompatible, ultraflexible electronics including active matrices of touch sensors, transistors, etc.  Here are a few representative papers that give you some idea of the kind of work that goes into this: Engineering semiconducting polymers to have robust elastic properties while retaining high charge mobilities; a way of combining conducting polymers (PEDOT) with hydrogels so that you can pattern them and then hydrate to produce super-soft devices; a full-on demonstration of artificial skin for sensing applications.  Very impressive stuff.

Today, we had a colloquium by Gabe Aeppli of ETH and the Paul Scherrer Institute, talking about x-ray ptychographic imaging.  Ptychography is a simple enough idea.  Use a coherent source of radiation to illuminate some sample at some spot, and with a large-area detector, measure the diffraction pattern.  Now scan the spot over the sample (including perhaps rotating the sample) and record all those diffraction patterns as well.  With the right approach, you can combine all of those diffraction patterns and invert to get the spatial distribution of the scatterers (that is, the matter in the sample).  Sounds reasonable, but these folks have taken it to the next level (pdf here).   The video I'm embedding here is the old result from 2017.  The 2019 paper I linked here is even more impressive, able to image, nondestructively, in 3D, individual circuit elements within a commercial integrated circuit at nanoscale resolution.  It's clear that a long-term goal is to be able to image, non-destructively, the connectome of brains.
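As a cartoon of how that reconstruction step works, here is a minimal one-dimensional ptychography sketch using the standard ePIE-style update with a known probe.  All sizes, the probe shape, and the test object are invented for illustration; this is a toy, not the actual pipeline used for the x-ray work described above:

```python
import numpy as np

N, M, step = 64, 16, 4          # object length, probe length, scan step
positions = np.arange(0, N - M + 1, step)

# Invented ground-truth complex object and a known Gaussian probe
x = np.arange(N)
obj_true = (1 + 0.5*np.cos(2*np.pi*x/N)) * np.exp(1j*0.8*np.sin(4*np.pi*x/N))
probe = np.exp(-0.5*((np.arange(M) - M/2)/(M/5))**2).astype(complex)

# "Measured" diffraction intensities at each overlapping scan position
I_meas = [np.abs(np.fft.fft(probe*obj_true[p:p+M]))**2 for p in positions]

def data_error(obj):
    # mismatch between modeled and measured diffraction amplitudes
    return sum(np.sum((np.sqrt(I) - np.abs(np.fft.fft(probe*obj[p:p+M])))**2)
               for p, I in zip(positions, I_meas))

obj = np.ones(N, complex)       # flat initial guess
err0 = data_error(obj)
for it in range(200):           # ePIE-style iterations with a known probe
    for p, I in zip(positions, I_meas):
        psi = probe*obj[p:p+M]
        Psi = np.fft.fft(psi)
        Psi = np.sqrt(I)*np.exp(1j*np.angle(Psi))   # enforce measured modulus
        dpsi = np.fft.ifft(Psi) - psi
        obj[p:p+M] += np.conj(probe)/np.max(np.abs(probe))**2 * dpsi
err = data_error(obj)
print(f"data error: {err0:.3e} -> {err:.3e}")
```

The key ingredient is the overlap between adjacent illumination spots: each patch of the object is constrained by several diffraction patterns at once, which is what makes the phase problem tractable.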

## Monday, January 20, 2020

### Brief items

Here are some items of interest:

• An attempt to lay out a vision for research in the US beyond Science: The Endless Frontier.  The evolving roles of the national academies are interesting, though I found the description of the future of research universities to be rather vague - I'm not sure growing universities to the size of Arizona State is the best way to provide high quality access to knowledge for a large population.  It still feels to me like an eventual successful endpoint for online education could be natural language individualized tutoring ("Alexa, teach me multivariable calculus."), but we are still a long way from there.
• Atomic-resolution movies of chemistry are still cool.
• Dan Ralph at Cornell has done a nice service to the community by making his lecture notes available on the arxiv.  The intent is for these to serve as a supplement to a solid state course such as one out of Ashcroft and Mermin, bringing students up to date about Berry curvature and topology at a similar level to that famous text.
• This preprint tries to understand an extremely early color photography process developed by Edmond Becquerel (the discoverer of the photovoltaic effect, and father of Henri Becquerel of radioactivity fame).  It turns out that there are systematic changes in reflectivity spectra of the exposed Ag/AgCl films depending on the incident wavelength.  Why the reflectivity changes that way remains a mystery to me after reading this.
• On a related note, this led me to this PNAS paper about the role of plasmons in the daguerreotype process.  Voila, nanophotonics in the 19th century.
• This preprint (now out in Nature Nano) demonstrates incredibly sensitive measurements of torques on very rapidly rotating dielectric nanoparticles.  This could be used to see vacuum rotational friction.
• The inventors of chemically amplified photoresists have been awarded the Charles Stark Draper prize.  Without that research, you probably would not have the computing device sitting in front of you....

## Tuesday, January 14, 2020

### The Wolf Prize and how condensed matter physics works

The Wolf Prize in Physics for 2020 was announced yesterday, and it's going to Pablo Jarillo-Herrero, Allan MacDonald, and Rafi Bistritzer, for twisted bilayer graphene.  This prize is both well-deserved and a great example of how condensed matter physics works.

MacDonald and Bistritzer did key theory work (for example) highlighting how the band structure of twisted bilayer graphene would become very interesting for certain twist angles - how the moire pattern from the two layers would produce a lateral periodicity, and that interactions between the layers would lead to very flat bands.  Did they predict every exotic thing that has been seen in this system?  No, but they had the insight to get key elements, and the knowledge that flat bands would likely lead to many competing energy scales, including electron-electron interactions, the weak kinetic energy of the flat bands, the interlayer coupling, effective magnetic interactions, etc.  Jarillo-Herrero was the first to implement this with sufficient control and sample quality to uncover a remarkable phase diagram involving superconductivity and correlated insulating states.  Figuring out what is really going on here and looking at all the possibilities in related layered materials will keep people busy for years.   (As an added example of how condensed matter works as a field, Bistritzer is in industry working for Applied Materials.)
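The geometry behind that moire periodicity is simple to work out: two identical lattices (lattice constant $a$) rotated by a small angle $\theta$ produce a moire pattern with period $\lambda \approx a/(2\sin(\theta/2))$.  Plugging in graphene's lattice constant at the "magic" angle near 1.1 degrees:

```python
import math

a = 0.246                              # graphene lattice constant (nm)
theta = math.radians(1.1)              # "magic" twist angle
lam = a / (2*math.sin(theta/2))        # moire superlattice period
print(f"moire period ~ {lam:.1f} nm")
```

That ~13 nm superlattice cell contains on the order of ten thousand carbon atoms (counting both layers), which is part of why the resulting minibands are so flat and the competing energy scales so small.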

All of this activity and excitement, thanks to feedback between well-motivated theory and experiment, is how the bulk of physics that isn't "high energy theory" actually works.

## Monday, January 13, 2020

### Popular treatment of condensed matter - topics

I'm looking more seriously at trying to do some popularly accessible writing about condensed matter.  I have a number of ideas about what should be included in such a work, but I'm always interested in other people's thoughts on this.   Suggestions?

## Sunday, January 05, 2020

### Brief items

Happy new year.  As we head into 2020, here are a few links I've been meaning to point out:

• This paper is a topical review of high-throughput (sometimes called combinatorial) approaches to searching for new superconductors.   The basic concept is simple enough:  co-deposit multiple different elements in a way that deliberately produces compositional gradients across the target substrate.  This can be done via geometry of deposition, or with stencils that move during the deposition process.  Then characterize the local properties in an efficient way across the various compositional gradients, looking for the target properties you want (e.g., maximum superconducting transition temperature).  Ideally, you combine this with high-throughput structural characterization and even annealing or other post-deposition treatment.  Doing all of this well in practice is a craft.
• Calling back to my post on this topic, Scientific American has an article about wealth distribution based on statistical mechanics-like models of economies.   It's hard for me to believe that some of these insights are really "new" - seems like many of these models could have been examined decades ago....
• This is impressive.  Jason Petta's group at Princeton has demonstrated controlled entanglement between single-electron spins in Si/SiGe gate-defined quantum dots separated by 4 mm.  That may not sound all that exciting; after all, photons have been used to entangle atoms separated by kilometers of optical fiber.  However, doing this on-chip using engineered quantum dots (with gates for tunable control) in an arrangement that is in principle scalable via microfabrication techniques is a major achievement.
• Just in case you needed another demonstration that correlated materials like the copper oxide superconductors are complicated, here you go.  These investigators use an approach based on density functional theory (see here, here, and here), and end up worrying about energetic competition between 26 different electronic/magnetic phases.  Regardless of the robustness of their specific conclusions, that alone tells you the inherent challenge of those systems:  many possible ordered states, all with very similar energy scales.
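On the wealth-distribution item above: the flavor of those statistical-mechanics-style models is easy to capture in a few lines.  Here is a minimal sketch of the so-called "yard-sale" model (the agent count, number of steps, and wager fraction are arbitrary choices): repeated fair coin-flip wagers of a fraction of the poorer party's wealth drive the Gini coefficient up from zero, i.e., toward wealth condensation, even though every individual transaction is fair:

```python
import numpy as np

rng = np.random.default_rng(1)
N, steps, frac = 200, 200000, 0.1   # agents, exchanges, wager fraction (arbitrary)
w = np.ones(N)                      # everyone starts with equal wealth

def gini(w):
    # Gini coefficient: 0 = perfect equality, 1 = one agent holds everything
    s = np.sort(w)
    n = len(s)
    return 2*np.sum(np.arange(1, n+1)*s)/(n*np.sum(s)) - (n+1)/n

g0 = gini(w)
for _ in range(steps):
    i, j = rng.choice(N, 2, replace=False)
    stake = frac * min(w[i], w[j])  # wager a fraction of the poorer agent's wealth
    if rng.random() < 0.5:          # fair coin flip decides the winner
        w[i] += stake; w[j] -= stake
    else:
        w[i] -= stake; w[j] += stake
print(f"Gini coefficient: {g0:.2f} -> {gini(w):.2f}")
```

The multiplicative character of the exchanges is what does the damage: an agent on a losing streak shrinks geometrically and can never fully recover, so wealth concentrates over time without any built-in bias.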