Wednesday, July 08, 2020

Brief items

Some further items of note:
  • There is great anxiety and frustration over the latest pronouncement from DHS/ICE about international students in the US.  Let me give a little context.  For many years there has been a rule that international students studying in the US can take no more than 3 credits (or equivalent) per semester of purely online instruction. The point of that was to prevent people from applying for F visas and then "studying" at online-only diploma mills while actually working. That is, it was originally a policy meant to ensure that student visas go to legitimate international students and scholars pursuing degrees at accredited universities.  In the spring, when the pandemic hit and many universities transitioned to online instruction in the middle of the semester, DHS granted a waiver on this requirement.  Well, now they are trying to rescind that waiver, and are doing so in a particularly draconian way: as written, if a university goes online-only, either from the start of the semester or even partway through due to public health concerns, its international students would face having to leave the US on short notice.   This is a terrible, stupid, short-sighted way to handle the situation, and it doesn't remotely serve the best interests of any constituency (students, universities, or the country).  Unsurprisingly, many, many organizations are pushing back against this.  Hopefully there will be changes and/or workarounds.  
  • On to science.  Quanta has an article about the origins of the rigidity of glass.  The discussion there is about whether there is a kind of hidden structural order in the glassy material.  Fundamentally (as I've written previously), rigidity in any solid results from a combination of very slow timescales for atomic motion (due to lack of thermal energy available to overcome "barriers") and the Pauli principle giving a hard-core repulsion between atoms.  Still, the question of the underlying nature of glassy systems remains fascinating.
  • The 2D materials experts at Columbia have shown clean fractional quantum Hall physics in a monolayer of WSe<sub>2</sub>.  The actual paper is here.  I have yet to come up with a really nice, generally accessible write-up of the FQH effect. The super short version:  Confine charge carriers in strictly two dimensions, and throw in a large magnetic field perpendicular to the plane (such that the energy associated with cyclotron motion dominates the kinetic energy). At certain ratios of magnetic field to number of charge carriers, the charge carriers can condense into new collective states (generally distinguished by topology rather than broken symmetries, as in the liquid-gas or nonmagnetic/ferromagnetic phase transitions).  The fractional quantum Hall states can have all sorts of unusual properties, but the key point here is that they are fragile.  Too much disorder (like missing atoms or charged impurities), and the energy associated with that disorder can swamp out the energy savings of condensing into such a state.  It's remarkable that the material quality of the monolayer transition metal dichalcogenide (and its encapsulating boron nitride surroundings) is so high.  Seeing how FQH states evolve in this new material system, with its rich band structure, should be interesting.
  • I feel bad for only now learning about this great series of talks about the state of the art in spintronics, trying to understand, engineer, and control the motion of spins.
  • For your animal video needs, get the behind-the-scenes story about Olive and Mabel here.
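To put a rough number on the field-to-carrier ratio mentioned in the FQH item above, here is a minimal sketch of the Landau level filling factor \(\nu = n_{s}h/(eB)\); the density and field values below are purely illustrative, not taken from the paper.

```python
# Sketch: Landau level filling factor nu = n_s * h / (e * B) for carriers
# confined to 2D in a perpendicular magnetic field B.
# Density and field values below are illustrative, not from the paper.
h = 6.62607015e-34   # Planck constant, J*s
e = 1.602176634e-19  # elementary charge, C

def filling_factor(n_s, B):
    """n_s: 2D carrier density in m^-2; B: field in tesla."""
    return n_s * h / (e * B)

# e.g., a carrier density of 3e15 m^-2 at 12 T sits near filling factor 1
print(round(filling_factor(3e15, 12.0), 3))
```

The fractional states of interest live at rational fillings below 1, which is why such large fields (or such low densities) are needed.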

Tuesday, June 30, 2020

How do hot electrons get hot?

We have a paper that came out today that was very fun.  It's been known for a long time that if you apply a sufficiently large voltage \(V\) to a tunnel junction, it is possible to get light emission, as I discussed here a bit over a year ago, and as is shown at the right.  Conventionally, the energy of the emitted photons \(\hbar \omega\) is less than \(eV\) (give or take the thermal energy scale \(k_{\mathrm{B}}T\)), assuming that single-electron processes are all that can happen.  
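As a concrete illustration of that single-electron bound, here is a quick estimate of the shortest emission wavelength it allows, \(\lambda_{\mathrm{min}} = hc/(eV)\); the bias value is just an illustrative choice.

```python
# Sketch: if single-electron processes dominate, emitted photons satisfy
# hbar*omega <= e*V, i.e. wavelengths longer than lambda_min = h*c/(e*V).
# The bias value is illustrative.
h = 6.62607015e-34   # Planck constant, J*s
c = 2.99792458e8     # speed of light, m/s
e = 1.602176634e-19  # elementary charge, C

def cutoff_wavelength_nm(V):
    """Shortest emission wavelength (nm) allowed by hbar*omega <= eV."""
    return h * c / (e * V) * 1e9

print(round(cutoff_wavelength_nm(1.5)))  # ~827 nm for a 1.5 V bias
```

So at biases around a volt, single-electron emission should be confined to the near-infrared and longer wavelengths - which is what makes above-threshold visible emission interesting.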

In this new paper looking at planar metal tunnel junctions, we see several neat things:
  • The emitted spectra look like thermal radiation with some effective temperature for the electrons and holes \(T_{\mathrm{eff}}\), emitted into a device-specific spectral shape and polarization (the density of states for photons doesn't look like that of free space, because the plasmon resonances in the metal modify the emission, an optical antenna effect).  Once the effective temperature is taken into account, the raw spectra (left) all collapse onto a single shape for a given device.

  • That temperature \(T_{\mathrm{eff}}\) depends linearly on the applied voltage, when looking at a whole big ensemble of devices.  This is different from what others have previously seen.  That temperature, describing a steady-state nonequilibrium tail of the electronic distribution local to the nanoscale gap, can be really high - around 2000 K - much higher than that experienced by the atoms in the lattice.
  • In a material with really good plasmonic properties, it is possible to have almost all of the emitted light come out at energies larger than \(eV\) (as in the spectra above).  That doesn't mean we're breaking conservation of energy, but it does mean that the emission process is a multi-electron one.  Basically, at comparatively high currents, a new hot carrier is generated before the energy from the last (or last few) hot carriers has had a chance to leave the vicinity (either by carrier diffusion or dumping energy to the lattice). 
  • We find that the plasmonic properties matter immensely, with the number of photons out per tunneling electron being 10000\(\times\) larger for pure Au (a good plasmonic material) than for Pd (a poor plasmonic material in this energy range).  
That last point is a major clue.  As we discuss in the paper, we think this implies that plasmons don't just couple the light out efficiently.  Rather, the plasmons also play a key role in generating the hot nonequilibrium carriers themselves.   The idea is that tunneling carriers don't just fly through - they can excite local plasmon modes, most of which almost immediately decay into hot electron/hole excitations with energies up to \(eV\) away from the Fermi level.  Hot carriers are potentially useful for a lot of things, including chemistry.  I'm also interested in whether some fun quantum optical effects can take place in these extreme nanoscale light sources.  Lots to do!
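A minimal sketch of the spectral-collapse idea with synthetic data: if each spectrum is a device-specific shape times a Boltzmann factor \(\exp(-\hbar\omega/k_{\mathrm{B}}T_{\mathrm{eff}})\), then the unknown device shape cancels in the ratio of two spectra, and the log of that ratio is linear in photon energy. The spectral shape and all parameter values here are made up for illustration, not taken from the paper.

```python
import numpy as np

# Sketch: model each spectrum as S(hw) = D(hw) * exp(-hw / (kB * T_eff)),
# where D is a device-specific antenna/plasmon spectral shape.
# D and all parameter values are made up for illustration.
kB = 8.617333262e-5  # Boltzmann constant, eV/K
hw = np.linspace(1.2, 2.0, 200)          # photon energies, eV
D = np.exp(-((hw - 1.5) ** 2) / 0.08)    # made-up device spectral shape

def synth_spectrum(T_eff):
    return D * np.exp(-hw / (kB * T_eff))

S1, S2 = synth_spectrum(1500.0), synth_spectrum(2500.0)

# The unknown shape D cancels in the ratio of two spectra, so
# log(S1/S2) is linear in hw with slope -(1/kB)*(1/T1 - 1/T2).
slope = np.polyfit(hw, np.log(S1 / S2), 1)[0]
delta_inv_T = -slope * kB  # recovers 1/1500 - 1/2500

# Dividing out the Boltzmann factor collapses each spectrum back onto D:
collapsed = S1 * np.exp(hw / (kB * 1500.0))
print(delta_inv_T, np.allclose(collapsed, D))
```

With real data, of course, the antenna shape and the temperatures both have to be extracted from the measured spectra rather than assumed.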

Saturday, June 27, 2020

Brief items

Some science items that crossed my path that you may find interesting:
  • This article at Quanta is a nice look at the Ising model for a general audience.  When I took graduate statistical mechanics from Lenny Susskind, he told the story of Lars Onsager just casually mentioning in the middle of a conference talk that he had solved the 2D Ising model exactly.
  • If you have any interest in the modern history of advanced transistors, the special FinFET ones that are now the mainstays of ultrascaled high performance processors, you might find this article to be fun.
  • With all the talk about twisted bilayers of van der Waals materials for exotic electronic properties, it’s cool to see this paper, which looks at the various nonlinear optical processes that can be enabled in similar structures.  Broken structural symmetries are the key to allowing certain nonlinear processes, and the moiré-plus-twist approach is quite the playground.
  • This preprint is very cool, where the authors have made basically an interferometer in the fractional quantum Hall regime for electrons confined in 2D, and can show clean results that demonstrate nontrivial statistics.  The aspect of this that I think is hard for non-experimentalists to appreciate is how challenging it is to create a device like this that is so clean - the fractional quantum Hall states are delicate, and it is an art form to create devices to manipulate them without disorder or other problems swamping what you want to measure.
Coming at some point, a post or two about my own research.

Wednesday, June 24, 2020

A nation of immigrants

Real life has been intruding rudely on my blogging time.  I will try to step up, but nothing seems to be slowing down this summer.

I sense from the comments on my last post that there is some demand to talk about US immigration policy as it pertains to the scientific community (undergraduate and graduate students, postdocs, scholars, faculty members).  I've been doing what little I can to try to push back against what's going on.  I think the US has benefited enormously from being a training destination for many of the world's scientists and engineers - the positive returns to the country overall and the economy have been almost unquantifiably large.  Current policies seem to me to be completely self-defeating.  As I wrote over three years ago alluding to budget cuts (which thankfully Congress never implemented), there is hysteresis and an entropic component in policy-making.  It's depressingly easy to break things that can be very difficult to repair.  Using immigration policy to push away the world's scientists and engineers from the US is a terrible mistake that runs the risk of decades of long-term negative consequences.

Monday, June 15, 2020

The foil electret microphone

Pivoting back toward science by way of technology.... Some very large fraction of the microphones out there in electronic gadgets are based on electrets.  An electret is an insulating material with a locked-in electrical polarization - for example, take a molten or solvated polymer, embed highly polar molecules in there, and solidify in the presence of a large polarizing electric field.  The electrical polarization means that there is an effective surface charge density.  You can make that electret into a free-standing foil or a film coating a backing to make a diaphragm.  When that film vibrates, it will generate an oscillating voltage on a nearby electrode (which could, say, be the gate electrode of a field-effect transistor).  Voila - a microphone that is simple, readily manufacturable, and doesn't need an external power supply.  
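For a rough sense of scale, here is a back-of-envelope (parallel-plate) estimate of the open-circuit voltage swing when diaphragm motion changes the gap; both the effective charge density and the displacement below are illustrative guesses, and real microphone geometries (backplate holes, air damping) are more complicated.

```python
# Back-of-envelope estimate (parallel-plate sketch): with an effective surface
# charge density sigma on the electret, a diaphragm displacement delta_d that
# changes the air gap produces an open-circuit voltage swing of roughly
# sigma * delta_d / epsilon_0.  Both numbers below are illustrative guesses.
eps0 = 8.8541878128e-12  # vacuum permittivity, F/m
sigma = 2e-4             # effective surface charge density, C/m^2
delta_d = 1e-9           # 1 nm gap change from sound pressure

delta_V = sigma * delta_d / eps0
print(f"{delta_V * 1e3:.1f} mV")  # tens of mV per nm - a rough upper bound
```

Even with generous assumptions, the point survives: nanometer-scale diaphragm motion gives an easily measurable voltage with no external bias required.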

While electret microphones are losing some market share to microelectromechanical ones in things like AirPods, they've played a huge part in the now-ubiquitous phone and acoustic technologies of the late 20th and early 21st centuries.  When I was a postdoc I was fortunate one day to meet their co-inventor, James West, who was still at Bell Labs, when (if I recall correctly) his summer student gave a presentation on some lead-free ultra-adhesive solder they were working on.  He was still patenting inventions within the last two years, in his late 80s - impressive!

Monday, June 08, 2020

Change is a depressingly long time in coming.

People don't read this blog for moralizing, and I surely don't have any particular standing, but staying silent out of concern for saying the wrong thing isn't tenable.  Black lives matter.  There is no more stark reminder of the depressingly long timescales for social progress than the long shadow cast by the US history of slavery.  I have to hope that together we can make lasting change - the scale of the outpouring in the last week has to be a positive sign.  The AAAS announced that on Wednesday June 10 they will be "observing #shutdownSTEM, listening to members of our community who are sharing resources and discussing ways to eliminate racism and make STEM more inclusive of Black people. www.shutdownstem.com. We encourage you to join us."  It's a start.

Wednesday, June 03, 2020

Non-academic careers and physics PhDs

With so many large-scale events happening right now (the pandemic, resulting economic displacement, the awful killing of George Floyd and resulting protests and unrest, federal moves regarding international students), it's hard not to feel like blogging is a comparatively self-indulgent activity.  Still, it is a way to try to restore a feeling of normalcy.  

The Pizza Perusing Physicist had asked, in this comment, if I could offer any guidance about non-academic careers for physics PhDs (including specific fields and career paths), beyond cliches about how PhD skills are valued by many employers.  I don't have any enormous font of wisdom on which to draw, but I do have a few points:
  • I do strongly recommend reading A PhD is Not Enough.  It's a bit older now, but has good insights.
  • It is interesting to look at statistics on where people actually land.  According to the AIP, about half of physics PhDs take initial academic jobs (postdocs and others); a third go to the private sector; and 14% go to government positions.  Similarly, you can see the skills that recent PhDs say they use in their jobs.  
  • I found it particularly interesting to read the comments from people ten years out from their degrees, since they have some greater perspective - seriously, check out that document.
  • Those latter two AIP documents show why "PhD skills are valued by employers" has become cliched - it's true.
  • In terms of non-academic career options for physics PhDs, there really are a wide variety, though like any career trajectory a great deal depends on the skills, flexibility, and foresight of the person.  Technical problem solving is a skill that a PhD should have learned - how to break big problems up into smaller ones, how to consider alternatives and come up with ways to test those, etc.  There is often a blurry line between physics and some types of engineering, and it is not uncommon for physics doctorates to get jobs at companies that design and manufacture stuff - as a condensed matter person, I have known people who have gone to work at places like Intel, Motorola, Seagate, National Instruments, Keysight (formerly Agilent), Northrop Grumman, Lockheed, Boeing, etc.  It is true that it can be hard to get your foot in the door and even know what options are available.  I wish I had some silver bullet on this, but your best bets are research (into job openings), networking, and career fairs, including at professional conferences.  Startups are also a possibility, though those come with their own risks.  Bear in mind that your detailed technical knowledge might not be what companies are looking for - I have seen experimentalist doctoral students go on to be very successful doing large-scale data analysis for oil services firms, for example.  Likewise, many people in the bioengineering and medical instrumentation fields have physics backgrounds.
  • If academia isn't for you, start looking around on the side early on.  Get an idea of the choices and a feel for what interests you. 
  • Make sure you're acquiring skills as well as getting your research done.  Learning how to program, how to manipulate and analyze large data sets, statistical methods - these are generally useful, even if the specific techniques evolve rapidly.
  • Communication at all levels is a skill - work at it.  Get practice writing, from very short documents (summarize your research in 150 words so that a non-expert can get a sense of it) to papers to the thesis.  Being able to write and explain yourself is essential in any high level career.  Get practice speaking with comfort, from presentations to more informal 1-on-1 interactions.  Stage presence is a skill, meaning it can be learned.  
  • Don't discount think tanks/analysis firms/patent firms - people who can tell the difference between reality and creative marketing language (whether about products or policies) are greatly valued.
  • Similarly, don't discount public policy or public service.  The fraction of technically skilled people in elected office in the US is woefully small (while the chancellor of Germany has a PhD in quantum chemistry).  These days, governing and policy making would absolutely benefit from an infusion of people who actually know what technology is and how it works, and can tell the difference between an actual study and a press release.
I'm sure more things will occur to me after I publish this.  There is no one-size-fits-all answer, but that's probably a good thing.

Wednesday, May 27, 2020

The National Science and Technology Foundation?

A proposal is being put in front of Congress that would reshape the National Science Foundation into the National Science and Technology Foundation.  The Senate bill is here, and the House equivalent bill is here.  The actual text of the Senate bill is here in pdf form.   In a nutshell, this "Endless Frontiers" bill (so named to echo the Vannevar Bush report that spurred the creation of the NSF in the first place) would do several things, including:
  • Create a Technology Directorate with its own advisory board (distinct from the National Science Board)
  • Identify ten key technology areas (enumerated in the bill, initially (i) artificial intelligence and machine learning; (ii) high performance computing, semiconductors, and advanced computer hardware; (iii) quantum computing and information systems; (iv) robotics, automation, and advanced manufacturing; (v) natural or anthropogenic disaster prevention; (vi) advanced communications technology; (vii) biotechnology, genomics, and synthetic biology; (viii) cybersecurity, data storage, and data management technologies; (ix) advanced energy; and (x) materials science, engineering, and exploration relevant to the other key technology focus areas)
  • Allocate funds via program managers who may use peer review in an advisory role (so, more like DOD than traditional NSF)
  • Invest $100B over 5 years, with the idea that the rest of NSF would also go up, but this new directorate would get the large bulk of the funding
This article at Science does a good job outlining all of this.  The argument is, basically, that the US is lagging in key areas and is not doing a good job translating basic science into technologies that ensure international primacy (with China being the chief perceived rival, though this is unstated in the bills, of course).  If this came to pass, and it's a big "if", this could fundamentally alter the character and mission of the NSF.  Seeing bipartisan congressional enthusiasm for boosting funding to the NSF is encouraging, but I think there are real hazards in pushing funding even farther toward applications, particularly in a governance and funding-decision model that would look so different from traditional NSF.  

It's worth noting that people have been having these arguments for a long time.  Here is a 1980 (!) article from Science back when a "National Technology Foundation" proposal was pending before Congress, for exactly the same perceived reasons (poor translation of basic science into technology and business competitiveness, though back then the Soviets were presumably the rivals of concern).  The NSF has its own history that mentions this, and how this tension led to the creation of the modern Engineering Directorate within NSF.  

Interesting times.  Odds are this won't pass, but it's a sign of bipartisan concern about the US falling behind its technological rivals.

Wednesday, May 20, 2020

Yet more brief items

Between writing deadlines, battling with reviewer 3 (I kid, I kid), and trying to get set for the tentative beginnings of restarting on-campus research, it's been a busy time.  I really do hope to do more blogging soon (suggested topics are always appreciated), but for now, here are a few more brief items:
  • This expression of editorial concern about this paper was an unwelcome surprise.  Hopefully all will become clear.  Here is a statement by the quantum information science-related center at Delft.
  • I happened across this press release, pointing out that nVidia's new chip will contain 54 billion transistors (!) fabbed with a 7 nm process.  For reference, the "7 nm" there is a label describing particular fabrication processes using finFETs, and doesn't really correspond to a physical feature size of 7 nm.  I discussed this here before.  Still impressive.
  • There is a lot of talk about moving cutting-edge semiconductor fabrication plants back to the US.  Intel and parts of GlobalFoundries aside, a large fraction of high end chip volume is produced outside the US.  There have long been national security and intellectual property concerns about the overseas manufacturing of key technologies, and the US DOD has decided that bringing some of this capability back on-shore is safer and more secure.  I'm surprised it's taken this long, though the enormous capital cost in setting up a foundry explains why these things are often done by large consortia.  The pandemic has also shown that depending on overseas suppliers for just-in-time delivery of things may not be the smartest move.
  • Speaking of that, I can't help but wonder about the cycle of unintended consequences that we have in our economic choices.  I've ranted (way) before about how the way the stock market and corporate governance function these days has basically squeezed away most industrial basic research.  Those same attitudes gave us "just-in-time" manufacturing and somehow convinced generations of corporate management that simple things like warehouses and stockrooms were inherently bad.  "Why keep a stockroom around, when you can always order a M5 allen head bolt via the internet and get it shipped overnight from hundreds or thousands of miles away?" runs the argument, the same kind of bogus accounting that implies that the continued existence of a space in the Bell Labs parking lot used to cost Lucent $30K/yr.   So, companies got rid of inventory, got rid of local suppliers, and then were smacked hard by the double-whammy of a US-China trade war and a global pandemic.  Now we are being bombarded with breathless stories about how the pandemic and people working from home might mean the complete delocalization of work - a vision of people working from anywhere, especially places more financially sustainable than the Bay Area.  I'm all for telecommuting when it makes sense, and minimizing environmental impact, and affordable places to live.  That being said, it's hard not to feel like a truly extreme adoption of this idea is risky.  What if, heaven forbid, there's a big disruption to the communications grid, such as a Carrington Event?  Wouldn't that basically obliterate the ability of completely delocalized companies to function?  
  • To end on a much lighter note, these videos (1, 2, 3, 4) have been a positive product of the present circumstances, bringing enjoyment to millions.

Sunday, May 10, 2020

Brief items

Apologies for the slowed frequency of posting.  Academic and research duties have been eating a lot of bandwidth.  Here are a few items that may be of interest:

  • This article about Shoucheng Zhang is informative, but at the same time very sad.  Any geopolitics aside, he was an intense, driven person who put enormous pressure on himself.  It says something about self-perception under depression that he was concerned that he was somehow not being recognized.  
  • This paper caught my eye.  If you want to see whether there is some dependence of electronic conduction on the relative directions of a material's crystal axes and the current, it makes sense to fabricate a series of devices oriented in different directions.  These authors take a single epitaxial film of a material (in this case the unconventional superconductor Sr<sub>2</sub>RuO<sub>4</sub>) and carve it into a radial array of differently oriented strips of material with measurement electrodes.   They find that there do seem to be "easy" and "hard" directions for transport in the normal state that don't have an obvious relationship to the crystal symmetry directions.  A similar approach was taken here in a cuprate superconductor.  
  • I like the idea of making characterization tools broadly available for low cost - it's great for the developing world and potentially for use in public secondary education.  This work shows plans for a readily producible optical microscope that can have digital imaging, motorized sample positioning, and focusing for a couple of hundred dollars.  Fancier than the foldscope, but still very cool.  Time to think more about how someone could make a $100 electron microscope....
  • Here is a nice review article from the beginning of the year about spin liquids.
  • I was going to point out this article about ultralow temperature nanoelectronics back in March, but the pandemic distracted me.  From grad school I have a history in this area, and the progress is nice to see.  The technical challenges of truly getting electrons cold are formidable.
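For the oriented-strips experiment above, the logic of the measurement can be sketched simply: a long narrow strip forces the current along its axis, so for an in-plane anisotropic resistivity tensor the strip measures the projection \(\rho(\theta) = \rho_{a}\cos^{2}\theta + \rho_{b}\sin^{2}\theta\). The principal values below are illustrative, not from the paper.

```python
import numpy as np

# Sketch: for an in-plane anisotropic resistivity tensor with principal values
# rho_a and rho_b, a narrow strip cut at angle theta to the a-axis measures
# the projection rho(theta) = rho_a*cos^2(theta) + rho_b*sin^2(theta).
# Principal values are illustrative, not from the paper.
rho_a, rho_b = 1.0, 1.5  # principal resistivities, arbitrary units

def strip_resistivity(theta):
    """Longitudinal resistivity seen by a strip at angle theta (radians)."""
    return rho_a * np.cos(theta) ** 2 + rho_b * np.sin(theta) ** 2

angles = np.deg2rad([0, 30, 45, 60, 90])
print(strip_resistivity(angles))
```

The surprise in the paper is that the measured "easy" and "hard" axes do not line up with the crystal axes the way a picture like this would naively predict.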

Thursday, April 30, 2020

On the flexural rigidity of a slice of pizza

People who eat pizza (not the deep dish casserole style from Chicago, but normal pizza), unbeknownst to most of them, have developed an intuition for a key concept in elasticity and solid mechanics. 

I hope that all right-thinking people agree that pizza slice droop (left hand image) is a problem to be avoided.  Cheese, sauce, and toppings are all in serious danger of sliding off the slice and into the diner's lap if the tip of the slice flops down.  Why does the slice tend to droop?   If you hold the edge of the crust and try to "cantilever" the slice out into space, the weight of the sauce/toppings exerts a downward force, and therefore a torque that makes the crust droop.  

A simple way to avoid this problem is shown in the right-hand image (shamelessly stolen from here).  By bending the pizza slice about an axis that runs from the crust to the slice tip, the same pizza slice becomes much stiffer against drooping.   Why does this work?  Despite what the Perimeter Institute says here, I really don't think that differential geometry has much to do with this problem, except in the sense that there are constraints on what the crust can do if its volume is approximately conserved.  

The reason the curved pizza slice is stiffer turns out to be the same reason that an I-beam is stiffer than a square rod of the same cross-sectional area.  Imagine an I-beam with a heavy weight (its own, for example) that would tend to make it droop.  In drooping a tiny bit, the top of the I-beam would get stretched out, elongated along the \(z\) direction - it would be in tension.  The bottom of the I-beam would get squeezed, contracted along the \(z\) direction - it would be in compression.  Somewhere in the middle, the "neutral axis", the material would be neither stretched nor squeezed.  We can pick coordinates such that the line \(y=0\) is the neutral axis, and in the linear limit, the amount of stretching (strain) at a distance \(y\) away from the neutral axis would just be proportional to \(y\).  In the world of linear elasticity, the amount of restoring force per unit area ("normal stress") exhibited by the material is directly proportional to the amount of strain, so the normal stress \(\sigma_{zz} \propto y\).  If we add up all the little contributions of patches of area \(\mathrm{d}A\) to the restoring torque around the neutral axis, we get something proportional to \(\int y^2 \mathrm{d}A\).  The bottom line:  All other things being equal, "beams" with cross-sectional area far away from the neutral axis resist bending torques more than beams with area close to the neutral axis.

Now think of the pizza slice as a beam.  (We will approximate the pizza crust as a homogeneous elastic solid - not crazy, though really it's some kind of mechanical metamaterial carbohydrate foam.)  When the pizza slice is flat, the farthest that some constituent bit of crust can be from the neutral axis is half the thickness of the crust.  When the pizza slice is curved, however, much more of its area is farther from the neutral axis - the curved slice will then resist bending much better, even made from the same thickness of starchy goodness as the flat slice.
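The argument above can be checked numerically: compute \(\int y^2 \mathrm{d}A\) about the centroidal (neutral) axis for a flat strip and for the same strip curled into a circular arc. A sketch with illustrative dimensions, using a thin-wall approximation for the arc:

```python
import numpy as np

# Sketch: second moment of area I = integral of y^2 dA about the centroidal
# (neutral) axis, for a flat strip vs. the same strip curled into a circular
# arc.  Same cross-sectional area; only the shape changes.  Dimensions are
# illustrative, and the arc uses a thin-wall approximation (dA = t * R * dtheta).
w, t = 0.10, 0.005  # slice width and crust thickness, m
R = 0.08            # radius of curvature of the curled slice, m

I_flat = w * t**3 / 12  # standard result for a flat rectangle

phi = w / (2 * R)                       # arc half-angle, so arc length = w
theta = np.linspace(-phi, phi, 20001)
y = R * (1 - np.cos(theta))             # height of mid-surface above arc bottom
y_c = np.mean(y)                        # centroid height (uniform spacing)
I_curved = t * R * (2 * phi) * np.mean((y - y_c) ** 2)

ratio = I_curved / I_flat
print(round(ratio, 1))  # curling this slice makes it ~10x stiffer in bending
```

Even this gentle curl moves enough crust away from the neutral axis to win by an order of magnitude, which matches everyday pizza experience.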

(Behold the benefits of my engineering education.) 

Wednesday, April 22, 2020

Brief items

A few more links to help fill the time:
  • Steve Simon at Oxford has put his graduate solid state course lectures online on youtube, linked from here.  I'd previously linked to his undergrad solid state lectures.   Good stuff, and often containing fun historical anecdotes that I hadn't known before.
  • Nature last week had this paper demonstrating operations of Si quantum dot-based qubits at 1.5 K with some decent fidelity.  Neat, showing that long electron spin coherence times are indeed realizable in these structures at comparatively torrid conditions.
  • Speaking of quantum computing, it was reported that John Martinis is stepping down as lead of google's superconducting quantum computation effort (these folks).  I've always thought of him as an absolutely fearless experimentalist, and while no one is indispensable, his departure leads me to lower my expectations about google's progress.  Update:  Forbes has a detailed interview with Martinis about this.  It's a very interesting inside look.  
  • I'd never heard of "the Poynting effect" before, and I thought this write-up was very nice.

Sunday, April 19, 2020

This week in the arxiv - magnons

Ages ago I wrote a description of magnons, which I really should revise.  The ultra-short version:  Magnetically ordered materials are classified by long-ranged patterns of how electronic spins are arranged.  For example, in a (single domain) ferromagnet, the spins all point in the same direction, and it costs energy to perturb that arrangement by tipping a spin.  Classically one can define spin waves, where there is some spatially periodic perturbation of the spin orientation (described by some wave vector \(\mathbf{k}\)), and that perturbation then oscillates in time with frequency \(\omega(\mathbf{k})\), like any of a large number of wave-like phenomena.  In the quantum limit, one can talk about the energy of exciting a single magnon, \(\hbar \omega\).  One can use this language to talk about making wavepackets and propagating magnons to transport angular momentum.
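For concreteness, here is the textbook dispersion for a 1D nearest-neighbor Heisenberg ferromagnet, \(\hbar \omega(k) = 4JS(1 - \cos ka)\), with illustrative parameter values:

```python
import numpy as np

# Textbook sketch: the magnon dispersion for a 1D Heisenberg ferromagnet with
# nearest-neighbor exchange J and spin S is hbar*omega(k) = 4*J*S*(1 - cos(k*a)).
# Parameter values are illustrative.
J = 1.0e-3   # exchange energy, eV
S = 0.5      # spin
a = 3.0e-10  # lattice spacing, m

def magnon_energy(k):
    """hbar*omega in eV for wavevector k (1/m)."""
    return 4 * J * S * (1 - np.cos(k * a))

# Gapless at k = 0, quadratic (~ 2*J*S*a^2*k^2) at small k,
# and maximal (8*J*S) at the zone boundary k = pi/a.
print(magnon_energy(0.0), magnon_energy(np.pi / a))
```

The quadratic small-\(k\) limit is the key difference from sound waves (which are linear in \(k\)), and it is why ferromagnetic magnons are so easy to thermally populate at low energies.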

Two papers appeared on the arxiv this week, back-to-back, taking this to the next level.  In condensed matter physics some of the most powerful techniques, in terms of learning about material properties and how they emerge, involve scattering.  That is, taking some probe (say visible light, x-rays, electrons, or neutrons) with a well-defined energy and momentum, firing it at a target of interest, and studying the scattered waves to learn about the target.  A related approach involves interferometry, where the propagation of waves (detected through changes in amplitude and phase) is sensitive to the local environment.  

The two preprints (this one and this one) establish that it is now possible to use magnons in both approaches.  This will likely open up a new route for characterizing and understanding micro- and nanoscale magnetic materials, which will be extremely useful (since, as I had to explain to a referee on a paper several years ago, it's actually not possible to use neutron scattering to probe a few-micron wide, few nm thick piece of material.)  In the former paper, magnons in yttrium iron garnet (a magnetic insulator called YIG, not to be confused with Yig, the Father of Serpents) are launched toward and scattered from a patch of permalloy film, and the scattered waves are detected and imaged sensitively.  In the latter, propagation and interference of magnons in YIG waveguides is imaged.  The great enabling technology for both of these impressive experiments has been the development over the last decade or so in the use of nitrogen-vacancy centers in diamond as incredibly sensitive magnetometers.   Very pretty stuff.


Sunday, April 12, 2020

What are anyons?

Because of the time lag associated with scientific publishing, there are a number of cool condensed matter results coming out now in the midst of the coronavirus impact.  One in particular prompted me to try writing up something brief about anyons aimed at non-experts.  The wikipedia article is pretty good, but what the heck.

One of the subtlest concepts in physics is the idea of "indistinguishable particles".  The basic idea seems simple.  Two electrons, for example, are supposed to be indistinguishable.  There is no measurement you could do on two electrons that would find different properties (say size or charge or response to magnetic fields).  For example, I should be able to pop an electron out of a hydrogen atom and replace it with any other electron, and literally no measurement you could do would be able to tell the difference between the hydrogen atoms before and after such a swap.  The consequences of true indistinguishability are far reaching even in classical physics.  In statistical mechanics, whether or not a collection of particles and that same collection with two particles swapped are really the same microscopic state is a big deal, with testable consequences.

In quantum mechanics, the situation is richer.  Let's imagine that the only parameter that matters is position.  (We are going to use position as shorthand to represent all of the quantum numbers associated with some particle.)  We can describe a two-particle system by some "state vector" (or wavefunction if you prefer) \( | \psi (\mathbf{r_{1}},\mathbf{r_{2}}) \rangle\), where the first vector is the position of particle 1 and the second is the position of particle 2.  Now imagine swapping the two particles.   After the swap, the state should be \( | \psi (\mathbf{r_{2}},\mathbf{r_{1}}) \rangle\).  The question is, how does that second state relate to the first state?  If the particles are truly indistinguishable, you'd think \( | \psi (\mathbf{r_{2}},\mathbf{r_{1}}) \rangle =  | \psi (\mathbf{r_{1}},\mathbf{r_{2}}) \rangle \). 

It turns out that that's not the only allowed situation.  One thing that must be true is that swapping the particles can't change the total normalization of the state (how much total stuff there is).  That restriction is written  \( \langle \psi (\mathbf{r_{2}},\mathbf{r_{1}}) | \psi (\mathbf{r_{2}},\mathbf{r_{1}}) \rangle = \langle \psi (\mathbf{r_{1}},\mathbf{r_{2}}) | \psi (\mathbf{r_{1}},\mathbf{r_{2}}) \rangle \).  If that's the most general restriction, then we can have other possibilities than the states before and after being identical.

For bosons, particles obeying Bose-Einstein statistics, the simple, intuitive situation does hold.  \( | \psi (\mathbf{r_{2}},\mathbf{r_{1}}) \rangle =  | \psi (\mathbf{r_{1}},\mathbf{r_{2}}) \rangle \).

For fermions, particles obeying Fermi-Dirac statistics, instead  \( | \psi (\mathbf{r_{2}},\mathbf{r_{1}}) \rangle = -  | \psi (\mathbf{r_{1}},\mathbf{r_{2}}) \rangle \).  This also preserves normalization, but has truly world-altering consequences.  This can only be satisfied for two particles at the same position if the state is identically zero.  This is what leads to the Pauli Principle and basically the existence of atoms and matter as we know them.
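To make that sign flip concrete, here is a minimal python sketch.  The two Gaussian orbitals are purely illustrative choices on my part, not tied to any particular system:

```python
import numpy as np

# Two arbitrary (hypothetical) single-particle orbitals:
def phi_a(x):
    return np.exp(-(x - 1.0)**2)

def phi_b(x):
    return np.exp(-(x + 1.0)**2)

def psi_fermion(x1, x2):
    # Antisymmetrized two-particle state (an unnormalized Slater determinant).
    return phi_a(x1) * phi_b(x2) - phi_b(x1) * phi_a(x2)

# Swapping the particles flips the overall sign...
assert np.isclose(psi_fermion(0.3, -0.7), -psi_fermion(-0.7, 0.3))
# ...and the amplitude vanishes when the two coordinates coincide,
# which is the Pauli principle at work:
assert abs(psi_fermion(0.5, 0.5)) < 1e-12
```

Any antisymmetric combination behaves this way; the vanishing at coincident coordinates is forced by the minus sign, not by the particular orbitals chosen.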

In principle, you could have something more general than that.  For so-called "abelian anyons", you could have the situation  \( | \psi (\mathbf{r_{2}},\mathbf{r_{1}}) \rangle = (e^{i \alpha})  | \psi (\mathbf{r_{1}},\mathbf{r_{2}}) \rangle \), where \(\alpha\) is some phase angle.  Then bosons are the special case where \(\alpha = 0\) or some integer multiple of \(2 \pi\), and fermions are the special case where \(\alpha = \pi\) or some odd multiple of \(\pi\).   
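In code form the exchange rule is nothing but multiplication by a phase, with \(\alpha\) as the only physics input:

```python
import cmath

def exchange_factor(alpha):
    # Phase factor acquired by the two-particle state under one exchange.
    return cmath.exp(1j * alpha)

# Bosons (alpha = 0): the state is unchanged.
assert cmath.isclose(exchange_factor(0.0), 1.0)
# Fermions (alpha = pi): the state changes sign.
assert cmath.isclose(exchange_factor(cmath.pi), -1.0)
# Either way, two successive exchanges restore the original state:
assert cmath.isclose(exchange_factor(cmath.pi)**2, 1.0)
# A generic abelian anyon (say alpha = pi/3) is different: two successive
# exchanges do NOT bring the state back to itself.
assert not cmath.isclose(exchange_factor(cmath.pi / 3)**2, 1.0)
```

That last line is the punchline: for anyons, the history of exchanges matters, which is exactly what makes them interesting.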

You might wonder, how would you ever pick up weird phase angles when particles are swapped in position?  This situation can arise for charged particles restricted to two dimensions in the presence of a magnetic field. The reason is rather technical, but it comes down to the fact that the vector potential \(\mathbf{A}\) leads to complex phase factors like the one above for charged particles.  

This brings me to this paper.  Anyons have been deeply involved in describing the physics of the fractional quantum Hall effect for a long time  (see here for example).  It's tricky to get direct experimental evidence for the unusual phase factor, though.  The authors of this new paper have basically performed a form of particle swapping via a scattering experiment, looking at correlations in where the particles end up (via the noise - fluctuations in the relative currents).  They do indeed see what looks like nice evidence for the expected anyonic properties of a particular quantum Hall state.  

(There are also "nonabelian" anyons, but that is for another time.)

Saturday, April 04, 2020

Brief items

A couple of interesting links:

  • From City University of New York, a paper on a bit of the physics relevant to the pandemic - specifically the issue of aerosolized droplets and air circulation in rooms.  The conclusion is that, based on common convection patterns, the best approach to clearing airborne contaminants is a ceiling-mounted suction filter as in surgical operating rooms.  (I suspect that vertical flow ceiling HEPA fan filter units with many air changes per hour as in microfabrication cleanrooms would also work, but it's not like anyone is going to install elevated, gridded flooring everywhere.)  Some of the basic physics of particle suspension is simple enough to teach to high school students, without even getting into viscosity and drag and real fluid mechanics.  The typical amount of kinetic energy that a would-be suspended particle picks up in collisions with its surroundings is on the order of \(k_{\mathrm{B}}T\), or about 26 meV (\(4.14 \times 10^{-21}\) J).  For a particle to stay readily suspended, that has to be comparable to the gravitational potential energy that it would cost to elevate the particle by its own typical size.  For a spherical droplet of the density of water, you'd be looking at something like \((4/3)\pi R^{3} \cdot \rho \cdot g \cdot 2R\), where \(R\) is the droplet radius, \(\rho\) is the density of water, 1000 kg/m\(^{3}\), and \(g\) is the gravitational acceleration, 9.807 m/s\(^{2}\).  Setting those equal and solving gives \(R \approx 470\) nm.  
  • The always excellent Natalie Wolchover has a new article in Quanta about how one limiting factor in gravitational-wave interferometers is the quality of the glass used in the dielectric mirrors.  Specifically, the tunneling two-level systems (see here and here) in ordinary amorphous insulating dielectrics at low temperatures are a problem.  It's like I've said ever since my doctoral work:  tunneling two-level systems are everywhere, and they're evil.
  • As pointed out by many, this paper has a novel approach to room temperature superconductivity.  This is a bit like my idea of converting my entire lab into ultra-high vacuum workspace.  Sure, personnel would all have to wear special spacesuits, but it would really help preserve samples.
  • In these days of social distancing, this was also amusing.
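For the curious, the droplet-size estimate in the first bullet above takes only a few lines to check, using nothing beyond the numbers already quoted:

```python
import math

# Balance thermal energy against the gravitational cost of lifting a
# water droplet by its own diameter: k_B*T = (4/3)*pi*R^3 * rho * g * 2R.
kB_T = 4.14e-21     # J, ~26 meV at room temperature
rho = 1000.0        # kg/m^3, density of water
g = 9.807           # m/s^2

# Solve (8/3)*pi*rho*g*R^4 = kB_T for R:
R = (3 * kB_T / (8 * math.pi * rho * g)) ** 0.25
print(f"R ≈ {R * 1e9:.0f} nm")    # prints R ≈ 474 nm, i.e. the ~470 nm above
```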
Please stay safe.  I know it's hard to stay positive while all of this is going on, but remember that you're not alone.



Monday, March 30, 2020

Phil Anderson and the end of an era

Social media spread the word yesterday evening that Phil Anderson, intellectual giant of condensed matter physics, had passed away at the age of 96.

It is hard to overstate the impact that Anderson had on the field.  In terms of pure scientific results, there are others far more skilled than I who can describe his contributions, but I will mention a few that are well known:

  • He developed what is now known as the Anderson model, a theoretical treatment originally intended to capture the essential physics in some transition metal-based magnets.  The model considers comparatively localized d orbitals and includes both hopping to neighboring sites in a lattice as well as the "on-site repulsion" U that makes it energetically expensive to have two electrons (in a spin singlet) on the same site.  This leads to "superexchange" processes, where energetically costly double-occupancy is a virtual intermediate state.  The Anderson model became the basis for many developments - allow coupling between the local sites and delocalized s or p bands, and you get the Kondo model.  Put in coupling to lattice vibrations and you get the Anderson-Holstein model.  Have a lattice and make the on-site repulsion really strong, and you get the Hubbard model famed in correlated electron circles and as the favored treatment of the copper oxide superconductors.
  • Anderson also made defining contributions to the theory of localization.  Electrons in solids are wavelike, and in perfect crystal lattices the ones in the conduction and valence bands propagate right past the ions because the waves themselves account for the periodicity of the lattice.  Anderson showed that even in the absence of interactions (the electron-electron repulsion), disorder can scatter those waves, and interference effects can lead to situations where the final result is waves that are exponentially damped with distance.  This is called Anderson localization, and it applies to light and sound as well as electrons.  With strict conditions, this result implies that (ignoring interactions) infinitesimal amounts of disorder can make a 2D electronic system an insulator.  
  • Here is his Nobel Lecture, by the way, that really focuses on these two topics.
  • In considering superconductivity, Anderson also discovered what is now known as the Higgs mechanism, showing that while the bare excitations of some quantum field theory could be massless, coupling those excitations to some scalar field whose particular value broke an underlying symmetry could lead to an effective mass term (in the sense of how momentum and energy relate to each other) for the originally massless degrees of freedom.  Since Anderson himself wrote about this within the last five years, I have nothing to add.
  • Anderson also worked on superfluidity in 3He, advancing understanding of this first-discovered non-electronic paired superfluid and its funky properties due to p-wave pairing.
  • With the discovery of the copper oxide superconductors, Anderson introduced the resonating valence bond (RVB) model that still shapes discussions of these and exotic spin-liquid systems.
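Anderson localization in particular is easy to see numerically.  Here is a toy example of my own (the chain length and disorder strengths are arbitrary illustrative choices): a 1D tight-binding chain with random on-site energies.  The inverse participation ratio (IPR) of a normalized eigenstate is roughly \(1/N\) for an extended state and of order one for a localized one, so it grows as disorder localizes the states:

```python
import numpy as np

rng = np.random.default_rng(0)

def midband_ipr(W, N=400, t=1.0):
    """IPR of a band-center eigenstate of a 1D chain with hopping t and
    on-site energies drawn uniformly from [-W/2, W/2]."""
    eps = rng.uniform(-W / 2, W / 2, size=N)
    H = (np.diag(eps)
         - t * np.diag(np.ones(N - 1), 1)
         - t * np.diag(np.ones(N - 1), -1))
    _, vecs = np.linalg.eigh(H)   # eigh returns normalized eigenvectors
    psi = vecs[:, N // 2]         # a state near the middle of the spectrum
    return np.sum(np.abs(psi)**4)

weak, strong = midband_ipr(W=0.5), midband_ipr(W=8.0)
assert strong > weak   # stronger disorder -> more localized -> larger IPR
```

Even this non-interacting toy shows the essential point: disorder alone, with no electron-electron repulsion, is enough to trap the wavelike states.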
Beyond these and other scientific achievements, Anderson famously articulated a key intellectual selling point of condensed matter physics:  emergent properties from collective actions of large numbers of interacting degrees of freedom can be profound, non-obvious, and contain foundational truths - that reductionism isn't always the path to understanding or "fundamental" insights.  More is different.  He also became a vocal critic of the Superconducting Super Collider.  (For what it's worth, while this certainly didn't help collegiality between high energy and condensed matter physics, there were many factors at play in the demise of the SSC.  Anderson didn't somehow single-handedly kill it.)

Anderson was unquestionably a brilliant person who in many ways defined the modern field of condensed matter physics.  He was intellectually active right up to the end, and he will be missed.  (For one of my own interactions with him, see here.)

Friday, March 20, 2020

(Experimentalist) grad students + postdocs in the time of covid-19

As I write this, a very large fraction of the research universities in the US (and much of the world) are either in a shutdown mode or getting there rapidly.  On-campus work is being limited to "essential" operations.  At my institution (and most of the ones I know about), "essential" means (i) research directly related to diagnosing/treating/understanding covid-19; (ii) minimal efforts necessary to keep experimental animals and cell lines going, as the alternative would be years or decades of lost work; (iii) maintenance of critical equipment that will be damaged otherwise; (iv) support for undergraduates unable to get home.

For people in some disciplines, this may not be that disruptive, but for experimentalists (or field researchers), this is an enormous, unplanned break in practice.  Graduate students face uncertainty (even more than usual), and postdocs doubly so - I haven't seen anything online discussing their situation.  An eight week hitch in the course of a six year PhD is frustrating, but in a limited-duration postdoc position it's disproportionately worse, and the economics faced by universities and industry will complicate the job market for a while.  Many of these people are also far from their families.

If we'd experienced something like this before, I could offer time-worn wisdom, but we've never had circumstances like this in the modern (post-WWII) research era.  This whole situation feels surreal to me.  Frankly, focusing on science and the routine parts of the job has been a challenge, and I figure it has to be worse for people not as established as I am.  Here are a few thoughts, suggestions, and links as we move to get through this:

  • While we may be physically socially distancing, please talk with your friends, family, and colleagues, by phone, skype, zoom, slack, wechat, whatever.  Try not to get sucked into the cycle of breaking news and the toxic parts of social media.  Please take advantage of your support structure, and if you need to talk to someone professional, please reach out.  We're in this together - you don't have to face everything by yourself.
  • Trying to set up some kind of routine and sticking to it is good.  Faculty I know are trying to come up with ways to keep their folks intellectually engaged - regular group meetings + presentations by zoom; scheduled seminars and discussions via similar video methods across research groups and in some cases even across different universities.  For beginning students, this is a great time to read (really read) the literature and depending on your program, study for your candidacy/qualifier.  Again, you don't have to do this alone; you can team up with partners on this.  For students farther along, data analysis, paper writing, planning the next phase of your research, starting to work on the actual thesis writing, etc. are all possibilities.  For postdocs interested in academia, this is potentially a time to comb the literature and think about what you would like to do as a research program.  Some kind of schedule or plan is the way to divide this into manageable pieces instead of feeling like these are gigantic tasks. 
  • The Virtual March Meeting has continued to add talks.
  • My friend Steve Simon's solid state course lectures are all available.  They go with his book.  They are also just one example of the variety of talks available from Oxford - here are the other physics ones.
  • Here is a set of short pieces about topology in condensed matter from a few years ago.
  • And here is a KITP workshop on this topic from this past fall.
  • These are some very nice lecture notes about scientific computing using python.  Here is something more in-depth on github.  Could be a good time to learn this stuff....
  • On the lighter side, here are both PhD Comics movies for free streaming.
Feel free to leave more suggestions and links in the comments.  I'm sure we could all use them.  Stay safe.

Thursday, March 12, 2020

Exponentials, extrapolation, and prudence

It's been a remarkable week.  There seems to be a consensus among US universities, based in part on CDC guidelines, and in part on the logistically and legally terrifying possibility of having to deal with dormitories full of quarantined undergraduates, that the rest of the 2019-2020 academic year will be conducted via online methods.  This will be rough, but could well be a watershed moment for distance education techniques.  The companies that make the major software platforms (e.g. zoom, canvas) and their web storage are facing a remarkable trial by fire when the nation's large universities all come back from break and hundreds of thousands of students all try to use these tools at once.

At the same time that all this is going on, many doctoral programs around the country (including ours) that had not already done their graduate recruiting visitations were canceling open houses and trying to put together virtual experiences to do the job.  

There is a lot to unpack here, but it's worth asking:  Are people over-reacting?  I don't think so, and over-reacting would be better than the alternative, anyway.  Different estimates give a range of values, but it would appear that the age-averaged mortality rate of covid-19 is somewhere between 0.7% and 3%.  (The current number in the US is something like 2.9%, but that's probably an overestimate due to appallingly too little testing; in the non-Wuhan parts of China it's like 0.6%, but in Italy it's over 3%.)  The disease seems comparable in transmission to the annual influenza, which in the US is estimated to infect 35-40M people every year, and with a mortality rate of around 0.1% leads to something like 35-40K deaths per year.  Given this, it's not unreasonable to think that, unchecked, there could be between 250K and 1.2M deaths from this in the US alone.  A key issue in Italy stems from the hospitalization rate of around 10-15%.  If the cases come too rapidly in time, there just aren't enough hospital beds.  This is why flattening the curve is so important.
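For transparency, the arithmetic behind that range is nothing more than multiplying the flu-like reach by the estimated mortality rates:

```python
# Rough bounding estimate, using only the numbers quoted above:
infections_low, infections_high = 35e6, 40e6    # flu-like annual reach
mortality_low, mortality_high = 0.007, 0.03     # 0.7% to 3%

deaths_low = infections_low * mortality_low
deaths_high = infections_high * mortality_high

print(f"{deaths_low:,.0f} to {deaths_high:,.0f} deaths")
# prints 245,000 to 1,200,000 deaths
```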

It annoys me to see some people whom I generally respect scientifically seem to throw their numerical literacy out the window on this.  We shouldn't freak out and panic, but we should understand the underlying math and assumptions and take this appropriately seriously.

Update:  Notes from a meeting at UCSF (web archive version of link) hosted by, among others, Joe DeRisi.  I first met Joe when we became Packard Fellows back in 2003.  He's a brilliant and very nice guy, who with colleagues created the viral phylogeny chip that identified SARS as a previously unknown coronavirus and pinpointed its closest relatives.

Friday, March 06, 2020

More about the APS meeting(s) and covid-19

Just to follow up:

  • The APS is partnering with the Virtual March Meeting, as well as collecting talks and slides and linking them to the online meeting program.  
  • There is going to be a Virtual Science Forum this afternoon (Eastern Standard Time, Friday, March 6) using zoom as a meeting platform, featuring what would have been March Meeting invited talks by Florian Marquardt, Eun-Ah Kim, and Steve Girvin.
  • The APS is working on refunds.  All told, the society is going to lose millions of dollars on this.
  • I am very surprised that the webpage for the APS April Meeting does not, as of this writing, have anything on it at all about this issue.  I've already passed on my strong suggestion that they at least put up a notice that says "We are closely monitoring the situation and will make a firm decision about the status of the meeting by [date]."  
  • The ACS has a notice on their page about their national meeting scheduled for Philadelphia on March 22-26.  I'm rather surprised that they are still going ahead. Update:  ACS has now cancelled their spring meeting.
  • The MRS seems to have nothing up yet regarding their April meeting.
People tend to have poor intuition about exponential functions.  I'm not an alarmist, but it's important to consider:  total US cases of covid-19 today are at the level Wuhan was at seven weeks ago. Hopefully measures people are taking (social distancing, hand washing, dropping non-critical travel) plus seasonality of illness plus lower population density plus fewer smokers will help keep things comparatively manageable. The US government realistically will not take some of the steps taken by the Chinese government (e.g., wholesale travel restrictions, military-enforced quarantines).
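To see why a seven-week head start matters, here's a toy calculation; the three-day doubling time is purely an assumption for illustration, not a fitted number:

```python
# Hypothetical doubling time, chosen only to illustrate the math:
doubling_time_days = 3.0
head_start_days = 7 * 7    # seven weeks

doublings = head_start_days / doubling_time_days
growth_factor = 2.0 ** doublings
print(f"{doublings:.1f} doublings -> {growth_factor:,.0f}x growth")
# roughly 16 doublings, i.e. a factor of ~80,000
```

The exact numbers are not the point; the point is that modest-sounding doubling times compound into enormous factors over weeks, which is exactly where intuition fails.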


Tuesday, March 03, 2020

Virtual March Meeting

In the wake of the cancellation of the 2020 APS March Meeting due to concerns about COVID-19, an effort has sprung up, the Virtual March Meeting, with the idea of having would-be speakers record and upload their presentations.   (I believe that this was spearheaded by q-ctrl, but I'm not certain.  If someone knowledgeable about this would like to explain in the comments, that would be very helpful.)

In general, this is a great idea.  There were a number of talks, particularly some invited sessions, that I was very much hoping to see at the meeting, and if this is a way of providing access to at least some of that content, I'm all in favor.

There are some downsides.  No interactive Q&A.  Some people are willing to be speculative and show a couple of in-progress/not-yet-submitted slides in their talks, but they are unlikely to want their pre-publication ideas out there on the internet forever.  It seems unlikely that there will be large-scale participation, particularly by the generally busy folks who are giving the longer invited talks and prize talks.  Still, some effort to accommodate limited travel is better than nothing.

I've attempted to upload my own contributed talk, though it doesn't seem to have materialized yet on their site.  Here it is.  If you really want to get the March Meeting experience, you should watch this from the back of a small, uncomfortably crowded room with dodgy air temperature and unreliable audio.  Also, you should pretend that the session chair stands up and starts glowering at me on slide 18.   (This is in the spirit of a comment made by a friend who once said that he couldn't make it to Princeton reunions, so instead he was going to simulate the experience by pouring beer and mud in his shoes and squishing around in the humidity.)

Saturday, February 29, 2020

APS March Meeting cancelled

Hello all - I have just heard from Dan Arovas, program chair of the APS March Meeting, that the APS has decided to cancel the meeting, which was scheduled to begin tomorrow: "Just finished a Zoom meeting with APS CEO Kate Kirby, APS presidential line, secretary treasurer, counselor. APS is preparing a statement for release to the press. Right now you can help by informing all your students, postdocs, and colleagues. The web site will be updated as soon as possible."

This is a response to COVID19. As I post this, the meeting website has not yet been updated.  I will post more when I learn more.

Update: The text of the APS email: "Due to rapidly escalating health concerns relating to the spread of the coronavirus disease (COVID-19), the 2020 APS March Meeting in Denver, CO, has been canceled. Please do not travel to Denver to attend the March Meeting. More information will follow shortly."

Update: APS website now confirms.

Update: Here is the text of the letter from the APS president and CEO about the decision.
To the Board, the Council and Unit Leaders of APS:
You have probably already heard that on Saturday, February 29, the APS Leadership decided to cancel the 2020 March Meeting in Denver. We are writing to give you some of the details that led to this difficult decision, which was made in consultation with the APS senior management and the March Meeting program chair.
APS leadership has been monitoring the spread of the coronavirus disease (COVID-19) in the days leading up to the meeting. As you know, a large number of March Meeting attendees come from outside the US. Many have already canceled their attendance, particularly those from China, where travel to the meeting is not currently possible. In addition, we had many planning to come from countries where the CDC has upgraded its warning to level 3 as recently as the day of our decision, yesterday February 29. Even more were coming from countries where the virus appears to be establishing itself in the general population, so that the warning level could rise during the course of our meeting, which might significantly delay their return travel or even lead to quarantines.
In this case the safety of the attendees has to be a primary concern. There is a reasonable expectation that in a meeting with many thousands of participants, some will fall ill. This always happens of course, but it presently takes some time to establish whether an illness is seasonal flu or COVID-19, and many attendees who have come into contact might need to be quarantined during the testing. In light of this danger, we realized that ordinary social events such as the evening receptions would have to be cancelled out of caution.
We appreciate the high cost of our decision, both for the APS and also the attendees. We don’t know the actual loss yet, but the APS portion alone is certain to be in the millions of dollars. We want to assure the APS Board, Council, and Unit Leaders, that we have considered this carefully. Our society is strong financially, and we can absorb this loss. The welfare of our community is certainly a greater concern.
We know you have many questions about the path forward following this decision. We will continue to communicate and confer with you regularly in the coming weeks, as we all come to terms with the need to find new ways to maintain strong international science contacts.
Phil Bucksbaum, APS President
Kate Kirby, APS CEO

Monday, February 24, 2020

BAHFest 2020 at Rice, Sunday March 8 UPDATE: postponed.

Update:  This event is going to be postponed until the fall semester.

For those in the Houston area:

Spread the word - 

Created by SMBC's Zach Weinersmith, BAHFest is a celebration of well-argued and thoroughly researched but completely incorrect scientific theory. Our brave speakers present their bad theories in front of a live audience and a panel of judges with real science credentials, who together determine who takes home the coveted BAHFest trophy. And eternal glory, of course. If you'd like to learn more about the event, you can check out these articles from the Wall Street Journal and NPR's Science Friday.

Here are some examples from past shows:

Our keynote for this year's event is the hilarious Phil Plait (AKA the Bad Astronomer)! Phil will be doing a book signing of his book "Death from the Skies" before and after the show. 

The event is brought to you by BAHFest, and the graduate students in Rice University's Department of BioSciences. Click here for more information about the show, including how to purchase tickets. We hope to see you there! 

[Full disclosure:  I am one of the judges at this year's event.]

Saturday, February 22, 2020

Brief items

As we head out of a very intense week here and toward the March APS meeting, a few brief items:

  • Speaking of the March Meeting, I hear (unofficially) that travel restrictions due to the coronavirus have made a big dent - over 500 talks may be vacant, and the program committee is working hard to explore options for remote presentation.  (For the record, I fully endorse the suggestion that all vacant talks be delivered in the form of interpretive dance by Greg Boebinger.)
  • There will be many talks about twisted bilayers of various 2D materials at the meeting, and on that note, this PRL (arxiv version here) shows evidence of "strange metallicity" in magic-angle bilayer graphene at temperatures above the correlated insulator state(s).
  • Following indirectly on my post about condensed matter and Christmas lights, I want to point out another example of how condensed matter physics (in the form of semiconductor physics and the light emitting diode) has changed the world for the better in ways that could never have been anticipated.  This video shows and this article discusses the new film-making technique pioneered in the making of The Mandalorian.  Thanks to the development of organic LED displays, infrared LEDs for motion tracking, and lots of processing power, it is possible to create a floor-to-ceiling wraparound high definition electronic backdrop.  It's reconfigurable in real time, produces realistic lighting on the actors and props, and will make a lot of green screen compositing obsolete.  Condensed matter:  This is The Way.
  • Superconducting gravimeters have been used to check to see if there are compact objects (e.g., hunks of dark matter, or perhaps microscopic black holes) orbiting inside the earth.  I remember reading about this issue while in college.  Wild creative idea of the day:  Maybe we should use helioseismology to try to infer whether there are any such objects orbiting inside the sun....

Thursday, February 13, 2020

Film boiling and the Leidenfrost point

While setting up my eddy current bounce demonstration, I was able to film some other fun physics.

Heat transfer and two-phase (liquid+gas) fluid flow make for a complicated business that has occupied the time of many scientists and engineers for decades.  A liquid that is boiling at a given pressure is pinned to a particular temperature - that's the way the first-order liquid-vapor transition works.  Water at atmospheric pressure boils at 100 C; adding energy to the liquid water at 100 C via heat transfer converts water into vapor rather than increasing the temperature of the liquid.  

Here we are using liquid nitrogen (LN2), which boils at 77 K = -196 C at atmospheric pressure, and are trying to cool a piece of copper plate that initially started out much warmer than that.  When the temperature difference between the copper and the LN2 is sufficiently high, there is a large heat flux that creates a layer of nitrogen vapor between the copper and the liquid.  This is called film boiling.   You've seen this in practice if you've ever put a droplet of water into a really hot skillet, or dumped some LN2 on the floor.  The droplet slides around with very low friction because it is supported by that vapor layer.  

Once the temperature difference between the copper and the LN2 becomes small, the heat flux is no longer sufficient to support film boiling (the Leidenfrost point), and the vapor layer collapses - that brings more liquid into direct contact with the copper, leading to more vigorous boiling and agitation.  That happens at about 45 seconds into the video.  Then, once the copper is finally at the same temperature as the liquid, boiling ceases and everything gets calm.  

For a more technical discussion of this, see here.  It's written up on a site about nuclear power because water-based heat exchangers are a key component of multiple power generation technologies.  

Tuesday, February 11, 2020

Eddy currents - bouncing a magnet in mid-air

Changing a magnetic field that permeates a conductor like a metal will generate eddy currents.  This is called induction, and it was discovered by Michael Faraday nearly 200 years ago.   If you move a ferromagnet near a conductor, the changing field produces eddy currents and those eddy currents create their own magnetic fields, exerting forces back on the magnet.  Here is a rather dramatic demo of this phenomenon, shamelessly stolen by me from my thesis adviser.

In the video, you can watch in slow motion as I drop a strong Nd2Fe14B magnet from about 15 cm above a 2 cm thick copper plate.  The plate is oxygen-free, high-purity copper, and it has been cooled to liquid nitrogen temperatures (77 K = -196 C).   That cooling suppresses lattice vibrations and increases the conductivity of the copper by around a factor of 20 compared with room temperature.  (If cooled to liquid helium temperatures, 4.2 K, the conductivity of this kind of copper goes up to something like 200 times its room temperature value, and is limited by residual scattering from crystalline grain boundaries and impurities.)

As the magnet falls, the magnetic flux \(\Phi\) through the copper increases, generating a circumferential electromotive force and driving eddy currents.  Those eddy currents produce a magnetic field directed to repel the falling magnet.  The currents become large enough that the resulting upward force becomes strong enough to bring the magnet to a halt about 2 cm above the copper (!).  At that instant, \(d\Phi/dt = 0\), so the inductive EMF is zero.  However, the existing currents keep going because of the inductance of the copper.  (Treating the metal like an inductor-resistor circuit, the timescale for the current to decay is \(L/R\), and \(R\) is quite small.)  Those continuing currents generate magnetic fields that keep pushing up on the magnet, making it continue to accelerate upward.  The magnet bounces "in mid air".  Of course, the copper isn't a perfect conductor, so much of the energy is "lost" to resistively heating the copper, and the magnet gradually settles onto the plate.  If you try this at room temperature, the magnet clunks into the copper, because the copper conductivity is worse and the eddy currents decay so rapidly that the repulsive force is insufficient to bounce the magnet before it hits the plate.
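One can put a rough number on that \(L/R\) argument with a magnetic diffusion-time estimate, \(\tau \sim \mu_{0} \sigma d^{2}\).  The conductivity values below are textbook room-temperature copper plus the factor-of-20 enhancement mentioned above, so treat this as order-of-magnitude only:

```python
import math

# Order-of-magnitude eddy-current decay time for a conducting plate,
# estimated as a magnetic diffusion time: tau ~ mu0 * sigma * d^2.
# Conductivities are assumed values, not measurements of this plate.
mu0 = 4 * math.pi * 1e-7       # T*m/A, vacuum permeability
sigma_300K = 5.96e7            # S/m, textbook room-temperature copper
sigma_77K = 20 * sigma_300K    # using the ~20x factor quoted above
d = 0.02                       # m, plate thickness

tau_cold = mu0 * sigma_77K * d**2
tau_warm = mu0 * sigma_300K * d**2
print(f"77 K: {tau_cold * 1e3:.0f} ms, 300 K: {tau_warm * 1e3:.0f} ms")
# prints 77 K: 599 ms, 300 K: 30 ms
```

A decay time of order half a second at 77 K is long compared with the magnet's flight time, while ~30 ms at room temperature is not - consistent with the bounce in the cold case and the clunk in the warm one.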

(Later I'll make a follow-up post about other neat physics that happens while setting up this demo.)



Sunday, February 09, 2020

Updated: Advice on choosing a grad school

I realized it's been several years since I've run a version of this, and it's the right season....

This is written on the assumption that you have already decided, after careful consideration, that you want to get an advanced degree (in physics, though much of this applies to any other science or engineering discipline).  This might mean that you are thinking about going into academia, or it might mean that you realize such a degree will help prepare you for a higher paying technical job outside academia.  Either way,  I'm not trying to argue the merits of a graduate degree.
  • It's ok at the applicant stage not to know exactly what you want to do.  While some prospective grad students are completely sure of their interests, that's more the exception than the rule.  I do think it's good to have narrowed things down a bit, though.  If a school asks for your area of interest from among some palette of choices, try to pick one (rather than going with "undecided").  We all know that this represents a best estimate, not a rigid commitment.
  • If you get the opportunity to visit a school, you should go.  A visit gives you a chance to see a place, get a subconscious sense of the environment (a "gut" reaction), and most importantly, an opportunity to talk to current graduate students.  Always talk to current graduate students if you get the chance - they're the ones who really know the score.  A professor should always be able to make their work sound interesting, but grad students can tell you what a place is really like.
  • International students may have a very challenging time visiting schools in the US, between the expense (many schools can help defray costs a little but cannot afford to pay for trans-oceanic airfare) and visa challenges.  Trying to arrange Skype discussions with people at the school is a possibility, but that can also be challenging.  I understand that this constraint tends to push international students toward making decisions based heavily on reputation rather than up-close information.  
  • Picking an advisor and thesis area are major decisions, but it's important to realize that those decisions do not define you for the whole rest of your career.  I would guess (and if someone has real numbers on this, please post a comment) that the very large majority of science and engineering PhDs end up spending most of their careers working on topics and problems distinct from their theses.  Your eventual employer is most likely paying for your ability to think critically, structure big problems into manageable smaller ones, and do research, rather than for the particular detailed technical knowledge from your doctoral thesis.  A personal anecdote:  I did my graduate work on the ultralow temperature properties of amorphous insulators.  I no longer work at ultralow temperatures, and I don't study glasses either; nonetheless, I learned a huge amount in grad school about the process of research that I apply all the time.
  • Always go someplace where there is more than one faculty member with whom you might want to work.  Even if you are 100% certain that you want to work with Prof. Smith, and that the feeling is mutual, you never know what could happen, in terms of money, circumstances, etc.  Moreover, in grad school you will learn a lot from your fellow students and other faculty.  An institution with many interesting things happening will be a more stimulating intellectual environment, and that's not a small issue.
  • You should not go to grad school because you're not sure what else to do with yourself.  You should not go into research if you will only be satisfied by a Nobel Prize.  In both of those cases, you are likely to be unhappy during grad school.  
  • I know grad student stipends are low, believe me.  However, it's a bad idea to make a grad school decision based purely on a financial difference of a few hundred or a thousand dollars a year.  Different places have vastly different costs of living - look into this.  Stanford's stipends are profoundly affected by the cost of housing near Palo Alto and are not an expression of generosity.  Pick a place for the right reasons.
  • Likewise, while everyone wants a pleasant environment, picking a grad school largely based on the weather is silly.
  • Pursue external fellowships if given the opportunity.  It's always nice to have your own money and not be tied strongly to the funding constraints of the faculty, if possible.  (It's been brought to my attention that at some public institutions the kind of health insurance you get can be complicated by such fellowships.  In general, I still think fellowships are very good if you can get them.)
  • Be mindful of how departments and programs are run.  Is the program well organized?  What is a reasonable timetable for progress?  How are advisors selected, and when does that happen?  Who sets the stipends?  What are TA duties and expectations like?  Are there qualifying exams?  Where have graduates of that department gone after the degree?  Know what you're getting into!  Very often, information like this is available now in downloadable graduate program handbooks linked from program webpages.   
  • It's fine to try to communicate with professors at all stages of the process.  We'd much rather have you ask questions than the alternative.  If you don't get a quick response to an email, it's almost certainly due to busyness, not a deeply meaningful decision by the faculty member.  For a sense of perspective:  even before I was chair, I would get 50+ emails per day of various kinds, not counting all the obvious spam that gets filtered. 
There is no question that far more information is now available to would-be graduate students than at any time in the past.  Use it.  Look at departmental web pages, look at individual faculty member web pages.  Make an informed decision.  Good luck!