
Saturday, July 25, 2020

Kitchen science: insulated cups

An impromptu science experiment this morning.  A few months ago we acquired some very nice insulated tumblers (initially from causebox and then more from here).  Like all such insulated items, the inner and outer walls are made from a comparatively lousy thermal conductor, in this case stainless steel.  (Steel is an alloy, and the disorder in its micro- and nanoscale structure scatters electrons, giving it lower electrical (and hence thermal) conductivity than pure metals.)  Ideally the walls touch only at the very top lip of the cup where they are joined, and the space between the walls has been evacuated to minimize heat conduction by any trapped gas.  When the cup is working well, so that heat transfer has to take place along the thin metal wall, the interior wall tends to sit very close to the temperature of whatever liquid is inside, and the exterior wall tends to sit at room temperature.
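To get a feel for why the thin wall matters, here is a back-of-the-envelope sketch.  All the dimensions and material properties below are illustrative guesses (typical stainless steel and a roughly coffee-cup-sized tumbler), not measurements of these particular cups; the point is just the comparison between conduction along the thin wall and conduction across the gap if it held air instead of vacuum.

```python
import math

# Illustrative numbers only - not measured from the actual cups.
k_steel = 15.0    # W/(m K), typical stainless steel
k_air = 0.026     # W/(m K), still air
D = 0.08          # cup diameter (m)
H = 0.15          # cup height (m)
t_wall = 0.0005   # wall thickness (m)
gap = 0.003       # spacing between inner and outer walls (m)
dT = 20.0         # inside vs. room temperature difference (K)

# Conduction along the thin wall from the lip down: G = k * A / L,
# where A is the tiny cross-section of the wall annulus.
A_wall = math.pi * D * t_wall
G_wall = k_steel * A_wall / H

# Conduction straight across the gap if it were filled with air.
A_side = math.pi * D * H
G_air = k_air * A_side / gap

print(G_wall * dT)      # heat leak via the wall, in watts (a fraction of a watt)
print(G_air / G_wall)   # how much worse an air-filled gap would be
```

Even with air's lousy thermal conductivity, the huge side-wall area and short gap would leak heat tens of times faster than the long, skinny conduction path along the steel - which is why evacuating the gap (and not denting the walls together) matters so much.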

We accidentally dropped one of the cups this morning, making a dent near the base.  The question was, did this affect the thermal insulation of that cup?  To test this, we put four ice cubes and four ounces of water from our refrigerator into each cup and let them sit on the counter for 15 minutes.  Then we used an optical kitchen thermometer (with handy diode laser for pointing accuracy) to look at the exterior and interior wall temperatures.  (Apologies for the use of Fahrenheit units.)  Check this out.


The tumbler on the left is clearly doing a better job of keeping the outside warm and the inside cold.  If we then scrutinize the tumbler on the right we find the dent, which must be deep enough to bring the inner and outer walls barely into contact.


The bottom line:  Behold, science works.  Good insulated cups are pretty impressive engineering, but you really should be careful with them, because the layers really are close together and can be damaged.

Thursday, July 23, 2020

Recently in the arxiv - van der Waals interfaces

Sometimes when looking at the pace of results coming out of the 2D material community, I am reminded of an old Tom Lehrer joke about super-productive people:  "It's people like that who make you realize how little you've accomplished. It's a sobering thought, for example, that, when Mozart was my age, he had been dead for two years."  (See here, and then listen to the whole album - National Brotherhood Week has particular resonance this year.)

Recently in the arxiv, two back-to-back pairs of preprints were uploaded by extremely strong collaborations in the business of creating new condensed matter systems at the interfaces of stacked van der Waals materials (materials like graphene and mica that can be exfoliated down to atomically thin layers).

The first pair of papers (and my apologies if I missed others) were this one and this one.  In the former, the investigators take advantage of the energies of the bands in \(\alpha\)-RuCl3, and find that when it is stacked with various 2D materials (graphene, bilayer graphene, WSe2), electrons are spontaneously transferred from the 2D materials to the \(\alpha\)-RuCl3.  (The normally empty conduction band of \(\alpha\)-RuCl3 lies at lower energy than the top of the valence band of the 2D material.)  This leads to very high hole concentrations within the graphene (etc.), with comparatively minimal disorder, reminiscent of modulation doping, the technique used to achieve outstanding charge mobility in 2D electron and hole gases.  The latter paper is complementary to the former:  the investigators use near-field optical techniques to look at the plasmon properties of the graphene in such structures, and can back out the optical conductivity of the now-electron-doped \(\alpha\)-RuCl3.

The second pair of papers, this one and this one, show a whole hierarchy of insulating states that appear in moire structures made from twisted WS2/WSe2 bilayers.  As I've written before, putting together close but not identical lattices and/or twisting one layer relative to another leads to a moire pattern, and therefore to a superlattice for charge carriers at that interface.  Both groups find (the first using optical methods, the second using microwave techniques) that for a large number of rational ratios between the number of charge carriers and the number of superlattice sites, the system is very strongly insulating.  Each insulating state corresponds to a particular periodic arrangement of the charge carriers, which try to stay as far from each other as possible to minimize their potential energy.  These can be analogous to Wigner crystals and charge density waves.
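To get a sense of the scales, the standard small-angle estimate gives the moire period from the lattice mismatch and twist angle.  The numbers below are approximate literature values for WS2/WSe2, not taken from these papers:

```python
import numpy as np

a = 0.328e-9    # WSe2 lattice constant (m), approximate
delta = 0.04    # ~4% WS2/WSe2 lattice mismatch, approximate
theta = 0.0     # twist angle (rad); zero twist here, mismatch dominates

# Small-angle moire period: lam = a / sqrt(delta^2 + theta^2)
lam = a / np.sqrt(delta**2 + theta**2)

# Carrier density for exactly one carrier per moire cell
# (triangular superlattice, cell area sqrt(3)/2 * lam^2)
n_one_per_site = 2 / (np.sqrt(3) * lam**2)

print(lam * 1e9)        # moire period, ~8 nm
print(n_one_per_site)   # ~2e16 per m^2
```

An ~8 nm period means carrier densities of order 10^16 per m^2 fill the superlattice, which is comfortably reachable with electrostatic gating - part of why these systems are such a nice playground.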

Very cool stuff.

Wednesday, July 22, 2020

APS Division of Condensed Matter Physics invited symposium nominations


While no one knows right now whether the 2021 March Meeting will be held in person, online, or in some hybrid form, now is the time to put in your nominations for invited speakers and symposia for the Division of Condensed Matter Physics.  The deadline to nominate is August 17.  The whole community benefits from high-quality invited talks, so if you're in a position to do this, please think about it.


To Members of the Division of Condensed Matter Physics:

APS is now accepting invited speaker and invited symposium nominations for the March Meeting in 2021. Here is the link to the APS website for submitting nominations.

The Meeting is planned for March 15 to 19, 2021. Join more than 11,000 physicists attending, presenting, and networking at the APS March Meeting 2021. Showcase your work to a global audience of physicists, scientists, and students representing APS units and committees and explore groundbreaking research from industry, academia, and major labs.

Note that the decision regarding a virtual or in-person March Meeting will be made later this summer.

Jim Sauls
DCMP Secretary/Treasurer

Monday, July 13, 2020

Quantum Coffeehouse and other physics videos

Who doesn't need more videos to watch these days?

Erica Carlson has started a new Quantum Coffeehouse video series, including interviews with practicing physicists (including yours truly).  She has also presented a "Great Course", "Understanding the Quantum World".

I'm also a fan of Physics Girl.  I really liked her recent video with supercooled sodium acetate.

Minute Physics is truly outstanding, including their look at N95 masks and how they use electrets to gather and trap polarizable particles.

Andrew Dotson is reliably funny and insightful.

For the musically inclined, acapellascience is engaging, including their particularly timely William Rowan Hamilton.

I also have to plug my colleague Jason Hafner's channel, which netted him an on-screen appearance in the new movie Palm Springs.  Also making an appearance in the movie is Jim Freericks' edx course, Quantum Mechanics for Everyone.




Wednesday, July 08, 2020

Brief items - updated

Some further items of note:
  • There is great anxiety and frustration over the latest pronouncement from DHS/ICE about international students in the US.  Let me give a little context.  For many years there has been a rule that international students studying in the US can take no more than 3 credits (or equivalent) per semester of purely online instruction. The point of that was to prevent many people from applying for F visas and then "studying" at online-only diploma mills while actually working. That is, it was originally a policy meant to ensure that student visas go to legitimate international students and scholars pursuing degrees at accredited universities.  In the spring when the pandemic hit and many universities transitioned to online instruction in the middle of the semester, DHS granted a waiver on this requirement.  Well, now they are trying to rescind that, and are doing so in a particularly draconian way: As written, if a university goes online-only, either from the start of the semester or even partway through due to public health concerns, the international students would face having to leave the US on short notice.  This is a terrible, stupid, short-sighted way to handle this situation, and it doesn't remotely serve the best interests of any constituency (student, university, or country).  Unsurprisingly, many, many organizations are pushing back against this.  Hopefully there will be changes and/or workarounds.  UPDATE:  The administration appears to have backed down from this.  Hopefully that will stick.
  • On to science.  Quanta has an article about the origins of the rigidity of glass.  The discussion there is about whether there is a kind of hidden structural order in the glassy material.  Fundamentally (as I've written previously), rigidity in any solid results from a combination of very slow timescales for atomic motion (due to lack of thermal energy available to overcome "barriers") and the Pauli principle giving a hard-core repulsion between atoms.  Still, the question of the underlying nature of glassy systems remains fascinating.
  • The 2D materials experts at Columbia have shown clean fractional quantum Hall physics in a monolayer of WSe2.  The actual paper is here.  I have yet to come up with a really nice, generally accessible write-up of the FQH effect. The super short version:  Confine charge carriers strictly in two dimensions, and throw in a large magnetic field perpendicular to the plane (such that the energy associated with cyclotron motion dominates the kinetic energy). At certain ratios of magnetic field to number of charge carriers, the charge carriers can condense into new collective states (generally distinguished by topology rather than broken symmetries, unlike the liquid-gas or nonmagnetic/ferromagnetic phase transitions).  The fractional quantum Hall states can have all sorts of unusual properties, but the key point here is that they are fragile.  Too much disorder (like missing atoms or charged impurities), and the energy associated with that disorder can swamp the energy savings of condensing into such a state.  It's remarkable that the material quality of the monolayer transition metal dichalcogenide (and its encapsulating boron nitride surroundings) is so high.  Seeing how FQH states evolve in this new material system with its rich band structure should be interesting.
  • I feel bad for only now learning about this great series of talks about the state of the art in spintronics, trying to understand, engineer, and control the motion of spin.
  • For your animal video needs, get the behind-the-scenes story about Olive and Mabel here.
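Returning to the fractional quantum Hall item above: the "ratio of magnetic field to number of charge carriers" is the Landau level filling factor, \(\nu = nh/eB\).  A quick sketch of why FQH experiments need serious magnets (the carrier density below is just a typical dilute 2D value, not from the Columbia paper):

```python
# Landau level filling factor: nu = n h / (e B)
h = 6.62607e-34    # Planck constant (J s)
e = 1.60218e-19    # electron charge (C)
n = 1.0e15         # 2D carrier density (per m^2), a typical dilute value

nu_target = 1 / 3  # the most famous fractional quantum Hall state
B_needed = n * h / (e * nu_target)

print(B_needed)    # magnetic field (tesla) needed to reach nu = 1/3
```

Even at this modest density, hitting \(\nu = 1/3\) takes on the order of 12 T - and that's before worrying about keeping disorder low enough for the fragile state to survive.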

Tuesday, June 30, 2020

How do hot electrons get hot?

We have a paper that came out today that was very fun.  It's been known for a long time that if you apply a sufficiently large voltage \(V\) to a tunnel junction, it is possible to get light emission, as I discussed here a bit over a year ago, and as is shown at the right.  Conventionally, if single-electron processes are all that can happen, the energy of the emitted photons \(\hbar \omega\) is less than \(eV\) (give or take the thermal energy scale \(k_{\mathrm{B}}T\)).
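That single-electron cutoff translates directly into a shortest emitted wavelength, \(\lambda_{\mathrm{min}} = hc/eV\).  A quick sketch with an illustrative bias (the 1.5 V here is just an example, not a value from the paper):

```python
# Single-electron cutoff: photon energy <= eV, so wavelengths shorter than
# lambda_min = h c / (e V) should be absent (give or take ~kT).
h = 6.62607e-34    # Planck constant (J s)
c = 2.99792e8      # speed of light (m/s)
e = 1.60218e-19    # electron charge (C)

V = 1.5                     # applied bias (volts), illustrative
lam_min = h * c / (e * V)   # shortest wavelength from one-electron processes

print(lam_min * 1e9)        # in nm: ~827 nm, near-infrared
```

So seeing substantial emission at wavelengths shorter than this cutoff, as in the spectra discussed below, is the signature that something beyond single-electron processes is going on.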

In this new paper looking at planar metal tunnel junctions, we see several neat things:
  • The emitted spectra look like thermal radiation with some effective temperature for the electrons and holes \(T_{\mathrm{eff}}\), emitted into a device-specific spectral shape and polarization (the density of states for photons doesn't look like that of free space, because the plasmon resonances in the metal modify the emission, an optical antenna effect).  Once the effective temperature is taken into account, the raw spectra (left) all collapse onto a single shape for a given device.

  • That temperature \(T_{\mathrm{eff}}\) depends linearly on the applied voltage, when looking at a whole big ensemble of devices.  This is different from what others have previously seen.  That temperature, describing a steady-state nonequilibrium tail of the electronic distribution local to the nanoscale gap, can be really high, around 2000 K, much higher than that experienced by the atoms in the lattice.
  • In a material with really good plasmonic properties, it is possible to have almost all of the emitted light come out at energies larger than \(eV\) (as in the spectra above).  That doesn't mean we're breaking conservation of energy, but it does mean that the emission process is a multi-electron one.  Basically, at comparatively high currents, a new hot carrier is generated before the energy from the last (or last few) hot carriers has had a chance to leave the vicinity (either by carrier diffusion or dumping energy to the lattice). 
  • We find that the plasmonic properties matter immensely, with the number of photons out per tunneling electron being 10000\(\times\) larger for pure Au (a good plasmonic material) than for Pd (a poor plasmonic material in this energy range).
That last point is a major clue.  As we discuss in the paper, we think this implies that plasmons don't just couple the light out efficiently.  Rather, the plasmons also play a key role in generating the hot nonequilibrium carriers themselves.   The idea is that tunneling carriers don't just fly through - they can excite local plasmon modes, most of which almost immediately decay into hot electron/hole excitations with energies up to \(eV\) away from the Fermi level.  Hot carriers are potentially useful for a lot of things, including chemistry.  I'm also interested in whether some fun quantum optical effects can take place in these extreme nanoscale light sources.  Lots to do!
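Schematically, this is how an effective temperature gets pulled out of a spectral tail: on a log scale a Boltzmann-like tail is a straight line with slope \(-1/k_{\mathrm{B}}T_{\mathrm{eff}}\).  A toy version with synthetic, noise-free data (not our actual analysis pipeline):

```python
import numpy as np

kB = 8.617e-5                    # Boltzmann constant (eV/K)
T_true = 2000.0                  # the "hot" effective temperature (K)

E = np.linspace(1.2, 1.8, 50)    # photon energies in the tail (eV)
I = np.exp(-E / (kB * T_true))   # Boltzmann-like tail (synthetic data)

# On a log scale the tail is a line; its slope gives -1/(kB * T_eff).
slope, _ = np.polyfit(E, np.log(I), 1)
T_fit = -1.0 / (kB * slope)

print(T_fit)                     # recovers the 2000 K put in
```

In real data one also has to divide out the device-specific spectral shape first (the optical antenna effect above) before the collapse onto a single Boltzmann-like tail appears.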

Saturday, June 27, 2020

Brief items

Some science items that crossed my path that you may find interesting:
  • This article at Quanta is a nice look at the Ising model for a general audience.  When I took graduate statistical mechanics from Lenny Susskind, he told the story of Lars Onsager casually mentioning, in the middle of a conference talk, that he had solved the 2D Ising model exactly.
  • If you have any interest in the modern history of advanced transistors, the special FinFET ones that are now the mainstays of ultrascaled high performance processors, you might find this article to be fun.
  • With all the talk about twisted bilayers of van der Waals materials for exotic electronic properties, it’s cool to see this paper, which looks at the various nonlinear optical processes that can be enabled in similar structures.  Broken structural symmetries are the key to allowing certain nonlinear processes, and the moire plus twist approach is quite the playground.
  • This preprint is very cool, where the authors have made basically an interferometer in the fractional quantum Hall regime for electrons confined in 2D, and can show clean results that demonstrate nontrivial statistics.  The aspect of this that I think is hard for non-experimentalists to appreciate is how challenging it is to create a device like this that is so clean - the fractional quantum Hall states are delicate, and it is an art form to create devices to manipulate them without disorder or other problems swamping what you want to measure.
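For anyone who wants to poke at the Ising model from that first item directly, here is a minimal Metropolis Monte Carlo sketch (parameters are chosen purely for illustration).  Below the exact 2D critical temperature \(T_c \approx 2.27\,J/k_{\mathrm{B}}\) that Onsager's solution gives, the magnetization should stay near saturation:

```python
import numpy as np

rng = np.random.default_rng(0)
L, T, steps = 20, 1.5, 200_000   # lattice size, temperature (J/kB units), flip attempts
s = np.ones((L, L), dtype=int)   # start fully magnetized

for _ in range(steps):
    i, j = rng.integers(L, size=2)
    # Sum over the four nearest neighbors, periodic boundary conditions.
    nn = s[(i + 1) % L, j] + s[(i - 1) % L, j] + s[i, (j + 1) % L] + s[i, (j - 1) % L]
    dE = 2 * s[i, j] * nn        # energy cost of flipping spin (i, j)
    # Metropolis rule: always accept downhill moves, uphill with Boltzmann weight.
    if dE <= 0 or rng.random() < np.exp(-dE / T):
        s[i, j] *= -1

print(abs(s.mean()))             # stays near 1 well below Tc
```

Rerunning with \(T = 3\) (above \(T_c\)) makes the magnetization collapse toward zero - the phase transition that Onsager treated exactly.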
Coming at some point, a post or two about my own research.

Wednesday, June 24, 2020

A nation of immigrants

Real life has been intruding rudely on my blogging time.  I will try to step up, but nothing seems to be slowing down this summer.

I sense from the comments on my last post that there is some demand to talk about US immigration policy as it pertains to the scientific community (undergraduate and graduate students, postdocs, scholars, faculty members).  I've been doing what little I can to try to push back against what's going on.  I think the US has benefited enormously from being a training destination for many of the world's scientists and engineers - the positive returns to the country overall and the economy have been almost unquantifiably large.  Current policies seem to me to be completely self-defeating.  As I wrote over three years ago alluding to budget cuts (which thankfully Congress never implemented), there is hysteresis and an entropic component in policy-making.  It's depressingly easy to break things that can be very difficult to repair.  Using immigration policy to push away the world's scientists and engineers from the US is a terrible mistake that runs the risk of decades of long-term negative consequences.

Monday, June 15, 2020

The foil electret microphone

Pivoting back toward science by way of technology.... Some very large fraction of the microphones out there in electronic gadgets are based on electrets.  An electret is an insulating material with a locked-in electrical polarization - for example, take a molten or solvated polymer, embed highly polar molecules in there, and solidify in the presence of a large polarizing electric field.  The electrical polarization means that there is an effective surface charge density.  You can make that electret into a free-standing foil or a film coating a backing to make a diaphragm.  When that film vibrates, it will generate an oscillating voltage on a nearby electrode (which could, say, be the gate electrode of a field-effect transistor).  Voila - a microphone that is simple, readily manufacturable, and doesn't need an external power supply.  
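A crude parallel-plate estimate shows why the locked-in charge is enough to make a microphone with no battery.  The charge density, gap, and displacement below are plausible orders of magnitude I've picked for illustration, not numbers for any specific device:

```python
eps0 = 8.854e-12   # vacuum permittivity (F/m)
sigma = 1e-5       # effective electret surface charge density (C/m^2), illustrative
d = 25e-6          # nominal gap between electret film and electrode (m), illustrative
dd = 10e-9         # diaphragm displacement from a sound wave (m), illustrative

# Parallel-plate model with fixed charge: the field in the gap is set by sigma,
# so the open-circuit voltage just tracks the gap spacing.
E_gap = sigma / eps0
V0 = E_gap * d     # quiescent voltage across the gap
dV = E_gap * dd    # open-circuit signal from the vibration

print(V0)          # tens of volts of built-in bias, no power supply needed
print(dV)          # ~10 mV signal, easily handled by a nearby FET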

While electret microphones are losing some market share to microelectromechanical ones in things like AirPods, they've played a huge part in the now-ubiquitous phone and acoustic technologies of the late 20th and early 21st centuries.  When I was a postdoc I was fortunate one day to meet their coinventor, James West, who was still at Bell Labs, when (if I recall correctly) his summer student gave a presentation on some lead-free ultra-adhesive solder they were working on.  He was still patenting inventions within the last two years, in his late 80s - impressive!

Monday, June 08, 2020

Change is a depressingly long time in coming.

People don't read this blog for moralizing, and I surely don't have any particular standing, but staying silent out of concern for saying the wrong thing isn't tenable.  Black lives matter.  There is no more stark reminder of the depressingly long timescales for social progress than the long shadow cast by the US history of slavery.  I have to hope that together we can make lasting change - the scale of the outpouring in the last week has to be a positive sign.  The AAAS announced that on Wednesday June 10 they will be "observing #shutdownSTEM, listening to members of our community who are sharing resources and discussing ways to eliminate racism and make STEM more inclusive of Black people. www.shutdownstem.com. We encourage you to join us."  It's a start.

Wednesday, June 03, 2020

Non-academic careers and physics PhDs

With so many large-scale events happening right now (the pandemic, resulting economic displacement, the awful killing of George Floyd and resulting protests and unrest, federal moves regarding international students), it's hard not to feel like blogging is a comparatively self-indulgent activity.  Still, it is a way to try to restore a feeling of normalcy.  

The Pizza Perusing Physicist had asked, in this comment, if I could offer any guidance about non-academic careers for physics PhDs (including specific fields and career paths), beyond cliches about how PhD skills are valued by many employers.  I don't have any enormous font of wisdom on which to draw, but I do have a few points:
  • I do strongly recommend reading A PhD is Not Enough.  It's a bit older now, but has good insights.
  • It is interesting to look at statistics on where people actually land.  According to the AIP, about half of physics PhDs take initial academic jobs (postdocs and others), a third go to the private sector, and 14% go to government positions.  Similarly, you can see the skills that recent PhDs say they use in their jobs.
  • I found it particularly interesting to read the comments from people ten years out from their degrees, since they have some greater perspective - seriously, check out that document.
  • Those latter two AIP documents show why "PhD skills are valued by employers" has become cliched - it's true.
  • In terms of non-academic career options for physics PhDs, there really is a wide variety, though like any career trajectory a great deal depends on the skills, flexibility, and foresight of the person.  Technical problem solving is a skill that a PhD should have learned - how to break big problems up into smaller ones, how to consider alternatives and come up with ways to test those, etc.  There is often a blurry line between physics and some types of engineering, and it is not uncommon for physics doctorates to get jobs at companies that design and manufacture stuff - as a condensed matter person, I have known people who have gone to work at places like Intel, Motorola, Seagate, National Instruments, Keysight (formerly Agilent), Northrop Grumman, Lockheed, Boeing, etc.  It is true that it can be hard to get your foot in the door and even know what options are available.  I wish I had some silver bullet on this, but your best bets are research (into job openings), networking, and career fairs, including at professional conferences.  Startups are also a possibility, though those come with their own risks.  Bear in mind that your detailed technical knowledge might not be what companies are looking for - I have seen experimentalist doctoral students go be very successful doing large-scale data analysis for oil services firms, for example.  Likewise, many people in the bioengineering and medical instrumentation fields have physics backgrounds.
  • If academia isn't for you, start looking around on the side early on.  Get an idea of the choices and a feel for what interests you. 
  • Make sure you're acquiring skills as well as getting your research done.  Learning how to program, how to manipulate and analyze large data sets, statistical methods - these are generally useful, even if the specific techniques evolve rapidly.
  • Communication at all levels is a skill - work at it.  Get practice writing, from very short documents (summarize your research in 150 words so that a non-expert can get a sense of it) to papers to the thesis.  Being able to write and explain yourself is essential in any high level career.  Get practice speaking with comfort, from presentations to more informal 1-on-1 interactions.  Stage presence is a skill, meaning it can be learned.  
  • Don't discount think tanks/analysis firms/patent firms - people who can tell the difference between reality and creative marketing language (whether about products or policies) are greatly valued.
  • Similarly, don't discount public policy or public service.  The fraction of technically skilled people in elected office in the US is woefully small (while the chancellor of Germany has a PhD in quantum chemistry).  These days, governing and policy making would absolutely benefit from an infusion of people who actually know what technology is and how it works, and can tell the difference between an actual study and a press release.
I'm sure more things will occur to me after I publish this.  There is no one-size-fits-all answer, but that's probably a good thing.

Wednesday, May 27, 2020

The National Science and Technology Foundation?

A proposal is being put in front of Congress that would reshape the National Science Foundation into the National Science and Technology Foundation.  The Senate bill is here, and the House equivalent bill is here.  The actual text of the Senate bill is here in pdf form.   In a nutshell, this "Endless Frontiers" bill (so named to echo the Vannevar Bush report that spurred the creation of the NSF in the first place) would do several things, including:
  • Create a Technology Directorate with its own advisory board (distinct from the National Science Board)
  • Identify ten key technology areas (enumerated in the bill; initially (i) artificial intelligence and machine learning; (ii) high performance computing, semiconductors, and advanced computer hardware; (iii) quantum computing and information systems; (iv) robotics, automation, and advanced manufacturing; (v) natural or anthropogenic disaster prevention; (vi) advanced communications technology; (vii) biotechnology, genomics, and synthetic biology; (viii) cybersecurity, data storage, and data management technologies; (ix) advanced energy; and (x) materials science, engineering, and exploration relevant to the other key technology focus areas)
  • Have funds allocated by program managers who may use peer review in an advisory role (so, more like DOD than traditional NSF)
  • Invest $100B over 5 years, with the idea that the rest of NSF would also go up, but this new directorate would get the large bulk of the funding
This article at Science does a good job outlining all of this.  The argument is, basically, that the US is lagging in key areas and is not doing a good job translating basic science into technologies that ensure international primacy (with China being the chief perceived rival, though this is unstated in the bills of course).  If this came to pass, and it's a big "if", this could fundamentally alter the character and mission of the NSF.  Seeing bipartisan congressional enthusiasm for boosting funding to the NSF is encouraging, but I think there are real hazards in pushing funding even farther toward applications, particularly in a governance and funding-decision model that would look so different than traditional NSF.  

It's worth noting that people have been having these arguments for a long time.  Here is a 1980 (!) article from Science, back when a "National Technology Foundation" proposal was pending before Congress for exactly the same perceived reasons (poor translation of basic science into technology and business competitiveness, though then the rival of concern was presumably the Soviet Union rather than China).  The NSF has their own history that mentions this, and how this tension led to the creation of the modern Engineering Directorate within NSF.

Interesting times.  Odds are this won't pass, but it's a sign of bipartisan concern about the US falling behind its technological rivals.

Wednesday, May 20, 2020

Yet more brief items

Between writing deadlines, battling with reviewer 3 (I kid, I kid), and trying to get set for the tentative beginnings of restarting on-campus research, it's been a busy time.  I really do hope to do more blogging soon (suggested topics are always appreciated), but for now, here are a few more brief items:
  • This expression of editorial concern about this paper was an unwelcome surprise.  Hopefully all will become clear.  Here is a statement by the quantum information science-related center at Delft.
  • I happened across this press release, pointing out that nVidia's new chip will contain 54 billion transistors (!) fabbed with a 7 nm process.  For reference, the "7 nm" there is a label describing particular fabrication processes using finFETs, and doesn't really correspond to a physical feature size of 7 nm.  I discussed this here before.  Still impressive.
  • There is a lot of talk about moving cutting-edge semiconductor fabrication plants back to the US.  Intel and parts of GlobalFoundries aside, a large fraction of high end chip volume is produced outside the US.  There have long been national security and intellectual property concerns about the overseas manufacturing of key technologies, and the US DOD has decided that bringing some of this capability back on-shore is safer and more secure.  I'm surprised it's taken this long, though the enormous capital cost in setting up a foundry explains why these things are often done by large consortia.  The pandemic has also shown that depending on overseas suppliers for just-in-time delivery of things may not be the smartest move.
  • Speaking of that, I can't help but wonder about the cycle of unintended consequences that we have in our economic choices.  I've ranted (way) before about how the way the stock market and corporate governance function these days has basically squeezed away most industrial basic research.  Those same attitudes gave us "just-in-time" manufacturing and somehow convinced generations of corporate management that simple things like warehouses and stockrooms were inherently bad.  "Why keep a stockroom around, when you can always order an M5 allen head bolt via the internet and get it shipped overnight from hundreds or thousands of miles away?" runs the argument, the same kind of bogus accounting that implies that the continued existence of a space in the Bell Labs parking lot used to cost Lucent $30K/yr.  So, companies got rid of inventory, got rid of local suppliers, and then were smacked hard by the double-whammy of a US-China trade war and a global pandemic.  Now we are being bombarded with breathless stories about how the pandemic and people working from home might mean the complete delocalization of work - a vision of people working from anywhere, especially places more financially sustainable than the Bay Area.  I'm all for telecommuting when it makes sense, and minimizing environmental impact, and affordable places to live.  That being said, it's hard not to feel like a truly extreme adoption of this idea is risky.  What if, heaven forbid, there's a big disruption to the communications grid, such as a Carrington Event?  Wouldn't that basically obliterate the ability of completely delocalized companies to function?
  • To end on a much lighter note, these videos (1, 2, 3, 4) have been a positive product of the present circumstances, bringing enjoyment to millions.

Sunday, May 10, 2020

Brief items

Apologies for the slowed frequency of posting.  Academic and research duties have been eating a lot of bandwidth.  Here are a few items that may be of interest:

  • This article about Shoucheng Zhang is informative, but at the same time very sad.  Any geopolitics aside, he was an intense, driven person who put enormous pressure on himself.  It says something about self-perception under depression that he was concerned that he was somehow not being recognized.  
  • This paper caught my eye.  If you want to see whether there is some dependence of electronic conduction on the relative directions of a material's crystal axes and the current, it makes sense to fabricate a series of devices oriented in different directions.  These authors take a single epitaxial film of a material (in this case the unconventional superconductor Sr2RuO4) and carve it into a radial array of differently oriented strips of material with measurement electrodes.   They find that there do seem to be "easy" and "hard" directions for transport in the normal state that don't have an obvious relationship to the crystal symmetry directions.  A similar approach was taken here in a cuprate superconductor.  
  • I like the idea of making characterization tools broadly available for low cost - it's great for the developing world and potentially for use in public secondary education.  This work shows plans for a readily producible optical microscope that can have digital imaging, motorized sample positioning, and focusing for a couple of hundred dollars.  Fancier than the foldscope, but still very cool.  Time to think more about how someone could make a $100 electron microscope....
  • Here is a nice review article from the beginning of the year about spin liquids.
  • I was going to point out this article about ultralow temperature nanoelectronics back in March, but the pandemic distracted me.  From grad school I have a history in this area, and the progress is nice to see.  The technical challenges of truly getting electrons cold are formidable.

Thursday, April 30, 2020

On the flexural rigidity of a slice of pizza

People who eat pizza (not the deep dish casserole style from Chicago, but normal pizza), unbeknownst to most of them, have developed an intuition for a key concept in elasticity and solid mechanics. 

I hope that all right-thinking people agree that pizza slice droop (left-hand image) is a problem to be avoided.  Cheese, sauce, and toppings are all in serious danger of sliding off the slice and into the diner's lap if the tip of the slice flops down.  Why does the slice tend to droop?  If you hold the edge of the crust and try to "cantilever" the slice out into space, the weight of the sauce and toppings exerts a downward force, and therefore a torque that tends to bend the slice downward.  

A simple way to avoid this problem is shown in the right-hand image (shamelessly stolen from here).  By bending the pizza slice about an axis that runs from the crust to the tip, the same slice becomes much stiffer against drooping.  Why does this work?  Despite what the Perimeter Institute says here, I really don't think that differential geometry has much to do with this problem, except in the sense that there are constraints on what the crust can do if its volume is approximately conserved.  

The reason the curved pizza slice is stiffer turns out to be the same reason that an I-beam is stiffer than a square rod of the same cross-sectional area.  Imagine an I-beam with a heavy weight (its own, for example) that would tend to make it droop.  In drooping a tiny bit, the top of the I-beam would get stretched out, elongated along the \(z\) direction - it would be in tension.  The bottom of the I-beam would get squeezed, contracted along the \(z\) direction - it would be in compression.  Somewhere in the middle, the "neutral axis", the material would be neither stretched nor squeezed.  We can pick coordinates such that the line \(y=0\) is the neutral axis, and in the linear limit, the amount of stretching (strain) at a distance \(y\) away from the neutral axis would just be proportional to \(y\).  In the world of linear elasticity, the amount of restoring force per unit area ("normal stress") exhibited by the material is directly proportional to the amount of strain, so the normal stress \(\sigma_{zz} \propto y\).  If we add up all the little contributions of patches of area \(\mathrm{d}A\) to the restoring torque around the neutral axis, we get something proportional to \(\int y^2 \mathrm{d}A\).  The bottom line:  All other things being equal, "beams" with cross-sectional area far away from the neutral axis resist bending torques more than beams with area close to the neutral axis.

Now think of the pizza slice as a beam.  (We will approximate the pizza crust as a homogeneous elastic solid - not crazy, though really it's some kind of mechanical metamaterial carbohydrate foam.)  When the pizza slice is flat, the farthest that some constituent bit of crust can be from the neutral axis is half the thickness of the crust.  When the pizza slice is curved, however, much more of its area is farther from the neutral axis - the curved slice will then resist bending much better, even made from the same thickness of starchy goodness as the flat slice.
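The \(\int y^2 \mathrm{d}A\) argument above can be made quantitative with a few lines of numerical integration.  This is just a sketch with made-up dimensions (a 10 cm wide slice with a 5 mm crust, folded to a 5 cm radius of curvature), comparing the bending stiffness of the flat and curved cross-sections:

```python
import numpy as np

def second_moment_flat(width, t):
    """Area moment I = integral of y^2 dA for a flat strip of the given
    width and thickness t, about its neutral axis: width * t^3 / 12."""
    return width * t**3 / 12.0

def second_moment_curved(width, t, radius):
    """The same strip bent into a thin circular arc with the given radius
    of curvature (arc length = width), integrated numerically."""
    n = 200001
    half_angle = width / (2.0 * radius)          # arc length = 2 R theta
    phi = np.linspace(-half_angle, half_angle, n)
    dA = t * radius * (2.0 * half_angle / n)     # equal-area patches
    y = radius * np.cos(phi)                     # height of each patch
    ybar = y.mean()                              # neutral-axis height
    return np.sum((y - ybar) ** 2) * dA

# Made-up illustrative dimensions: slice 10 cm across, crust 5 mm thick,
# folded to a 5 cm radius of curvature.
I_flat = second_moment_flat(0.10, 0.005)
I_curved = second_moment_curved(0.10, 0.005, 0.05)
print(I_curved / I_flat)    # roughly 20x stiffer for this fold
```

Since the restoring torque in linear elasticity is proportional to \(\int y^2 \mathrm{d}A\), that ratio is directly the factor by which the fold stiffens the slice against drooping.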

(Behold the benefits of my engineering education.) 

Wednesday, April 22, 2020

Brief items

A few more links to help fill the time:
  • Steve Simon at Oxford has put his graduate solid state course lectures online on youtube, linked from here.  I'd previously linked to his undergrad solid state lectures.   Good stuff, and often containing fun historical anecdotes that I hadn't known before.
  • Nature last week had this paper demonstrating operations of Si quantum dot-based qubits at 1.5 K with some decent fidelity.  Neat, showing that long electron spin coherence times are indeed realizable in these structures at comparatively torrid conditions.
  • Speaking of quantum computing, it was reported that John Martinis is stepping down as lead of google's superconducting quantum computation effort (these folks).  I've always thought of him as an absolutely fearless experimentalist, and while no one is indispensable, his departure leads me to lower my expectations about google's progress.  Update:  Forbes has a detailed interview with Martinis about this.  It's a very interesting inside look.  
  • I'd never heard of "the Poynting effect" before, and I thought this write-up was very nice.

Sunday, April 19, 2020

This week in the arxiv - magnons

Ages ago I wrote a description of magnons, which I really should revise.  The ultra-short version:  Magnetically ordered materials are classified by long-ranged patterns of how electronic spins are arranged.  For example, in a (single domain) ferromagnet, the spins all point in the same direction, and it costs energy to perturb that arrangement by tipping a spin.  Classically one can define spin waves, where there is some spatially periodic perturbation of the spin orientation (described by some wave vector \(\mathbf{k}\)), and that perturbation then oscillates in time with frequency \(\omega(\mathbf{k})\), like any of a large number of wave-like phenomena.  In the quantum limit, one can talk about the energy of exciting a single magnon, \(\hbar \omega\).  One can use this language to talk about making wavepackets and propagating magnons to transport angular momentum.
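For the simplest concrete case, a 1D ferromagnetic Heisenberg chain, the spin-wave dispersion is a textbook result (not derived in this post); the sketch below just evaluates it, with made-up illustrative values of the exchange \(J\), spin \(S\), and lattice constant \(a\):

```python
import numpy as np

# Textbook spin-wave dispersion for a 1D ferromagnetic Heisenberg chain
# with nearest-neighbor exchange J, spin S, lattice constant a:
#     hbar * omega(k) = 4 J S (1 - cos(k a))
# The parameter values below are made up for illustration.
J = 1.0e-21            # exchange energy (joules)
S = 0.5                # spin quantum number
a = 0.4e-9             # lattice constant (meters)

def magnon_energy(k):
    """Energy (in joules) of a single magnon with wavevector k."""
    return 4.0 * J * S * (1.0 - np.cos(k * a))

k = np.linspace(0.0, np.pi / a, 200)
E = magnon_energy(k)
# Near k = 0 the band is quadratic, E ~ 2 J S a^2 k^2: gapless like
# acoustic phonons, but with omega ~ k^2 rather than omega ~ k.
```

That quadratic low-\(k\) behavior is one of the characteristic signatures distinguishing ferromagnetic magnons from sound waves.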

Two papers appeared on the arxiv this week, back-to-back, taking this to the next level.  In condensed matter physics some of the most powerful techniques, in terms of learning about material properties and how they emerge, involve scattering.  That is, taking some probe (say visible light, x-rays, electrons, or neutrons) with a well-defined energy and momentum, firing it at a target of interest, and studying the scattered waves to learn about the target.  A related approach involves interferometry, where the propagation of waves (detected through changes in amplitude and phase) is sensitive to the local environment.  

The two preprints (this one and this one) establish that it is now possible to use magnons in both approaches.  This will likely open up a new route for characterizing and understanding micro- and nanoscale magnetic materials, which will be extremely useful (since, as I had to explain to a referee on a paper several years ago, it's actually not possible to use neutron scattering to probe a few-micron wide, few-nm thick piece of material).  In the former paper, magnons in yttrium iron garnet (a magnetic insulator called YIG, not to be confused with Yig, the Father of Serpents) are launched toward and scattered from a patch of permalloy film, and the scattered waves are detected and imaged sensitively.  In the latter, propagation and interference of magnons in YIG waveguides is imaged.  The great enabling technology for both of these impressive experiments has been the development, over the last decade or so, of nitrogen-vacancy centers in diamond as incredibly sensitive magnetometers.   Very pretty stuff.


Sunday, April 12, 2020

What are anyons?

Because of the time lag associated with scientific publishing, there are a number of cool condensed matter results coming out now in the midst of the coronavirus impact.  One in particular prompted me to try writing up something brief about anyons aimed at non-experts.  The wikipedia article is pretty good, but what the heck.

One of the subtlest concepts in physics is the idea of "indistinguishable particles".  The basic idea seems simple.  Two electrons, for example, are supposed to be indistinguishable.  There is no measurement you could do on two electrons that would find different properties (say size or charge or response to magnetic fields).  For example, I should be able to pop an electron out of a hydrogen atom and replace it with any other electron, and literally no measurement you could do would be able to tell the difference between the hydrogen atoms before and after such a swap.  The consequences of true indistinguishability are far-reaching even in classical physics.  In statistical mechanics, whether a collection of particles and that same collection with two particles swapped are really the same microscopic state is a big deal, with testable consequences (the Gibbs paradox and the entropy of mixing, for example).

In quantum mechanics, the situation is richer.  Let's imagine that the only parameter that matters is position.  (We are going to use position as shorthand to represent all of the quantum numbers associated with some particle.)  We can describe a two-particle system by some "state vector" (or wavefunction if you prefer) \( | \psi (\mathbf{r_{1}},\mathbf{r_{2}}) \rangle\), where the first vector is the position of particle 1 and the second is the position of particle 2.  Now imagine swapping the two particles.   After the swap, the state should be \( | \psi (\mathbf{r_{2}},\mathbf{r_{1}}) \rangle\).  The question is, how does that second state relate to the first state?  If the particles are truly indistinguishable, you'd think \( | \psi (\mathbf{r_{2}},\mathbf{r_{1}}) \rangle =  | \psi (\mathbf{r_{1}},\mathbf{r_{2}}) \rangle \). 

It turns out that that's not the only allowed situation.  One thing that must be true is that swapping the particles can't change the total normalization of the state (how much total stuff there is).  That restriction is written  \( \langle \psi (\mathbf{r_{2}},\mathbf{r_{1}}) | \psi (\mathbf{r_{2}},\mathbf{r_{1}}) \rangle = \langle \psi (\mathbf{r_{1}},\mathbf{r_{2}}) | \psi (\mathbf{r_{1}},\mathbf{r_{2}}) \rangle \).  If that's the most general restriction, then we can have other possibilities than the states before and after being identical.

For bosons, particles obeying Bose-Einstein statistics, the simple, intuitive situation does hold.  \( | \psi (\mathbf{r_{2}},\mathbf{r_{1}}) \rangle =  | \psi (\mathbf{r_{1}},\mathbf{r_{2}}) \rangle \).

For fermions, particles obeying Fermi-Dirac statistics, instead  \( | \psi (\mathbf{r_{2}},\mathbf{r_{1}}) \rangle = -  | \psi (\mathbf{r_{1}},\mathbf{r_{2}}) \rangle \).  This also preserves normalization, but has truly world-altering consequences.  This can only be satisfied for two particles at the same position if the state is identically zero.  This is what leads to the Pauli Principle and basically the existence of atoms and matter as we know them.

In principle, you could have something more general than that.  For so-called "abelian anyons", you could have the situation  \( | \psi (\mathbf{r_{2}},\mathbf{r_{1}}) \rangle = (e^{i \alpha})  | \psi (\mathbf{r_{1}},\mathbf{r_{2}}) \rangle \), where \(\alpha\) is some phase angle.  Then bosons are the special case where \(\alpha = 0\) or some integer multiple of \(2 \pi\), and fermions are the special case where \(\alpha = \pi\) or some odd multiple of \(\pi\).   
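The exchange-phase bookkeeping above is simple enough to illustrate in a few lines.  This is just a numerical sketch (the single-particle orbitals below are made up; only the phase structure matters), showing the boson and fermion special cases and the Pauli principle falling out of antisymmetry:

```python
import cmath

def exchange_phase(alpha):
    """Phase factor e^{i alpha} picked up when two abelian anyons with
    statistics angle alpha are exchanged once."""
    return cmath.exp(1j * alpha)

# The two familiar special cases:
assert abs(exchange_phase(0.0) - 1.0) < 1e-12        # bosons: +1
assert abs(exchange_phase(cmath.pi) + 1.0) < 1e-12   # fermions: -1

# An anyon with alpha = pi/3 picks up a nontrivial phase, and even a
# double exchange (phase e^{2 i alpha}) does not return +1.
double_exchange = exchange_phase(cmath.pi / 3) ** 2

# Pauli principle as a two-line consequence: an antisymmetrized
# two-fermion amplitude vanishes when the coordinates coincide.
def psi_fermion(phi_a, phi_b, r1, r2):
    return phi_a(r1) * phi_b(r2) - phi_a(r2) * phi_b(r1)

phi_a = lambda r: cmath.exp(-r**2)        # made-up orbitals, for illustration
phi_b = lambda r: r * cmath.exp(-r**2)
print(psi_fermion(phi_a, phi_b, 0.7, 0.7))   # identically zero
</imports>```

The fact that a double exchange is not the identity for general \(\alpha\) is exactly why abelian anyons can only exist in two dimensions, where "swap twice" corresponds to a nontrivial winding of one particle around the other.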

You might wonder, how would you ever pick up weird phase angles when particles are swapped in position?  This situation can arise for charged particles restricted to two dimensions in the presence of a magnetic field. The reason is rather technical, but it comes down to the fact that the vector potential \(\mathbf{A}\) leads to complex phase factors like the one above for charged particles.  

This brings me to this paper.  Anyons have been deeply involved in describing the physics of the fractional quantum Hall effect for a long time (see here for example).  It's tricky to get direct experimental evidence for the unusual phase factor, though.  The authors of this new paper have basically performed a form of particle swapping via a scattering experiment, looking at correlations in where the particles end up (via the noise, i.e., fluctuations in the relative currents).  They do indeed see what looks like nice evidence for the expected anyonic properties of a particular quantum Hall state.  

(There are also "nonabelian" anyons, but that is for another time.)

Saturday, April 04, 2020

Brief items

A couple of interesting links:

  • From City University of New York, a paper on a bit of the physics relevant to the pandemic - specifically the issue of aerosolized droplets and air circulation in rooms.  The conclusion is that, based on common convection patterns, the best approach to clearing airborne contaminants is a ceiling-mounted suction filter as in surgical operating rooms.  (I suspect that vertical flow ceiling HEPA fan filter units with many air changes per hour, as in microfabrication cleanrooms, would also work, but it's not like anyone is going to install elevated, gridded flooring everywhere.)  Some of the basic physics of particle suspension is simple enough to teach to high school students, without even getting into viscosity, drag, and real fluid mechanics.  The typical amount of kinetic energy that a would-be suspended particle picks up in collisions with its surroundings is on the order of \(k_{\mathrm{B}}T\), or about 26 meV (\(4.14 \times 10^{-21}\) J).  For a particle to stay readily suspended, that has to be comparable to the gravitational potential energy that it would cost to elevate the particle by its own typical size.  For a spherical droplet of the density of water, you'd be looking at something like \((4/3)\pi R^{3} \cdot \rho \cdot g \cdot 2R\), where \(R\) is the droplet radius, \(\rho\) is the density of water, 1000 kg/m³, and \(g\) is the gravitational acceleration, 9.807 m/s².  Setting those equal and solving gives \(R \approx 470\) nm.  
  • The always excellent Natalie Wolchover has a new article in Quanta about how one limiting factor in gravitational-wave interferometers is the quality of the glass used in the dielectric mirrors.  Specifically, the tunneling two-level systems (see here and here) in ordinary amorphous insulating dielectrics at low temperatures are a problem.  It's like I've said ever since my doctoral work:  tunneling two-level systems are everywhere, and they're evil.
  • As pointed out by many, this paper has a novel approach to room temperature superconductivity.  This is a bit like my idea of converting my entire lab into ultra-high vacuum workspace.  Sure, personnel would all have to wear special spacesuits, but it would really help preserve samples.
  • In these days of social distancing, this was also amusing.
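The droplet estimate in the first bullet above can be reproduced in a few lines, solving \(k_{\mathrm{B}}T = (8/3)\pi \rho g R^{4}\) for \(R\):

```python
import math

# Balance thermal kinetic energy k_B T against the gravitational cost of
# lifting a water droplet by its own diameter:
#     k_B T = (4/3) pi R^3 * rho * g * (2R)  ->  R = (3 k_B T / (8 pi rho g))^(1/4)
kB_T = 4.14e-21      # J, k_B T at room temperature (~26 meV)
rho = 1000.0         # kg/m^3, density of water
g = 9.807            # m/s^2, gravitational acceleration

R = (3.0 * kB_T / (8.0 * math.pi * rho * g)) ** 0.25
print(f"R ~ {R * 1e9:.0f} nm")   # ~470 nm, as quoted above
```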
Please stay safe.  I know it's hard to stay positive while all of this is going on, but remember that you're not alone.



Monday, March 30, 2020

Phil Anderson and the end of an era

Social media spread the word yesterday evening that Phil Anderson, intellectual giant of condensed matter physics, had passed away at the age of 96.

It is hard to overstate the impact that Anderson had on the field.  In terms of pure scientific results, there are others far more skilled than I who can describe his contributions, but I will mention a few that are well known:

  • He developed what is now known as the Anderson model, a theoretical treatment originally intended to capture the essential physics in some transition metal-based magnets.  The model considers comparatively localized d orbitals and includes both hopping to neighboring sites in a lattice as well as the "on-site repulsion" U that makes it energetically expensive to have two electrons (in a spin singlet) on the same site.  This leads to "superexchange" processes, where energetically costly double-occupancy is a virtual intermediate state.  The Anderson model became the basis for many developments - allow coupling between the local sites and delocalized s or p bands, and you get the Kondo model.  Put in coupling to lattice vibrations and you get the Anderson-Holstein model.  Have a lattice and make the on-site repulsion really strong, and you get the Hubbard model famed in correlated electron circles and as the favored treatment of the copper oxide superconductors.
  • Anderson also made defining contributions to the theory of localization.  Electrons in solids are wavelike, and in perfect crystal lattices the ones in the conduction and valence bands propagate right past the ions because the waves themselves account for the periodicity of the lattice.  Anderson showed that even in the absence of interactions (the electron-electron repulsion), disorder can scatter those waves, and interference effects can lead to situations where the final result is waves that are exponentially damped with distance.  This is called Anderson localization, and it applies to light and sound as well as electrons.  With strict conditions, this result implies that (ignoring interactions) infinitesimal amounts of disorder can make a 2D electronic system an insulator.  
  • Here is his Nobel Lecture, by the way, that really focuses on these two topics.
  • In considering superconductivity, Anderson also discovered what is now known as the Higgs mechanism, showing that while the bare excitations of some quantum field theory could be massless, coupling those excitations to some scalar field whose particular value broke an underlying symmetry could lead to an effective mass term (in the sense of how momentum and energy relate to each other) for the originally massless degrees of freedom.  Since Anderson himself wrote about this within the last five years, I have nothing to add.
  • Anderson also worked on superfluidity in 3He, advancing understanding of this first-discovered non-electronic paired superfluid and its funky properties due to p-wave pairing.
  • With the discovery of the copper oxide superconductors, Anderson introduced the resonating valence bond (RVB) model that still shapes discussions of these and exotic spin-liquid systems.
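For concreteness, the strong-repulsion physics described in the first bullet above can be written compactly.  This is the standard textbook Hubbard Hamiltonian (notation mine): nearest-neighbor hopping \(t\), on-site repulsion \(U\), and the superexchange scale that emerges from second-order perturbation theory when double occupancy is only a virtual intermediate state:

```latex
% Hubbard Hamiltonian: nearest-neighbor hopping t plus on-site repulsion U
\begin{equation}
H = -t \sum_{\langle i,j \rangle, \sigma}
      \left( c^{\dagger}_{i\sigma} c_{j\sigma} + \mathrm{h.c.} \right)
    + U \sum_{i} n_{i\uparrow} n_{i\downarrow}
\end{equation}
% In the strong-repulsion limit (U >> t), virtual double occupancy gives
% the antiferromagnetic superexchange scale
\begin{equation}
J_{\mathrm{ex}} = \frac{4 t^{2}}{U}
\end{equation}
```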
Beyond these and other scientific achievements, Anderson famously articulated a key intellectual selling point of condensed matter physics:  emergent properties from collective actions of large numbers of interacting degrees of freedom can be profound, non-obvious, and contain foundational truths - that reductionism isn't always the path to understanding or "fundamental" insights.  More is different.  He also became a vocal critic of the Superconducting Super Collider.  (For what it's worth, while this certainly didn't help collegiality between high energy and condensed matter physics, there were many factors at play in the demise of the SSC.  Anderson didn't somehow single-handedly kill it.)

Anderson was unquestionably a brilliant person who in many ways defined the modern field of condensed matter physics.  He was intellectually active right up to the end, and he will be missed.  (For one of my own interactions with him, see here.)