Saturday, August 29, 2020

Diamond batteries? Unlikely.

The start of the academic year at Rice has been very time-intensive, hence the low blogging frequency.  I will try to remedy that, and once some of the dust settles I may well create a twitter account to point out results as they happen and drive traffic this way.  

In the meantime, there has been quite a bit of media attention this week paid to the claim by NDB that they can make nanodiamond-based batteries with some remarkable properties.  This idea was first put forward in this video.  The eye-popping part of the news release is this:  "And it can scale up to electric vehicle sizes and beyond, offering superb power density in a battery pack that is projected to last as long as 90 years in that application – something that could be pulled out of your old car and put into a new one."

The idea is not a new one.  The NDB gadget is a take on a betavoltaic device.  Take a radioactive source that is a beta emitter - in this case \(^{14}\)C, which decays into \(^{14}\)N plus an antineutrino plus an electron with an average energy of 49 keV - and capture the electrons and, ideally, the energy from the decay.  Betavoltaic devices produce power for a long time, set by the half-life of the radioactive species (here, 5700 years).  The problem is that the power of these systems is very low, which greatly limits their utility.  For applications where you need higher instantaneous power, the NDB approach appears to be to use the betavoltaic gizmo to trickle-charge an integrated supercapacitor that can support high output power.

To get a sense of the numbers:  If you had perfectly efficient capture of the decay energy from 14 grams (one mole) of \(^{14}\)C, my estimate of the total available power is 13 mW: (6.02e23 × 49000 eV × 1.602e-19 J/eV)/2/(5700 yr × 365.25 day/yr × 86400 s/day). If you wanted to deliver the energy of a full Tesla battery pack (80 kW-h), it would take (80000 W-h × 3600 s/h)/(0.013 W) = 2.2e10 seconds. Even if you had 10 kg of pure \(^{14}\)C, that would take you about 360 days.
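As a sanity check, here is the same back-of-the-envelope estimate in a few lines of Python (the numbers are the illustrative ones above, not NDB's specifications):

```python
# Back-of-the-envelope betavoltaic estimate: perfectly efficient capture of
# the 49 keV average beta energy from 14C.  Illustrative numbers only.
N_A = 6.022e23                     # atoms per mole
E_avg_J = 49e3 * 1.602e-19         # average beta energy per decay, J
t_half_s = 5700 * 365.25 * 86400   # 14C half-life, s

# Average power from one mole over one half-life (half the atoms decay):
power_W = (N_A * E_avg_J / 2) / t_half_s
print(f"~{power_W*1e3:.0f} mW per mole")   # ~13 mW

# Time to deliver an 80 kW-h (Tesla-scale) pack's worth of energy from 10 kg:
battery_J = 80e3 * 3600
moles = 10e3 / 14.0
days = battery_J / (power_W * moles) / 86400
print(f"~{days:.0f} days with 10 kg of pure 14C")
```

(Using the exact initial activity, with a factor of ln 2 rather than 1/2, raises the power to about 18 mW; the conclusion is the same order of magnitude either way.)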

Now, the actual image in the press release-based articles shows a chip-based battery labeled "100 nW", which is very reasonable.  This technology is definitely clever, but it just does not have the average power densities needed for an awful lot of applications.


Tuesday, August 18, 2020

Black Si, protected qubits, razor blades, and a question

The run-up to the new academic year has been very time-intensive, so unfortunately blogging has correspondingly been slow.  Here are three interesting papers I came across recently:

  • In this paper (just accepted at Phys Rev Lett), the investigators have used micro/nanostructured silicon to make an ultraviolet photodetector with an external quantum efficiency (ratio of number of charges generated to number of incoming photons) greater than 100%.  The trick is carrier multiplication - a sufficiently energetic electron or hole can in principle excite additional carriers through "impact ionization".  In the nano community, it has been argued that nanostructuring can help this, because nm-scale structural features can help fudge (crystal) momentum conservation restrictions in the impact ionization process. Here, however, the investigators show that nanostructuring is irrelevant for the process, and it has more to do with the Si band structure and how it couples to the incident UV radiation.  
  • In this paper (just published in Science), the authors have been able to implement something quite clever that's been talked about for a while.  It's been known since the early days of discussing quantum computing that one can try to engineer a quantum bit that lives in a "decoherence-free subspace" - basically try to set up a situation where your effective two-level quantum system (made from some building blocks coupled together) is much more isolated from the environment than the building blocks themselves individually.  Here they have done this using a particular kind of defect in silicon carbide "dressed" with applied microwave EM fields.  They can increase the coherence time of the composite system by 10000x compared with the bare defect.
  • This paper in Science uses very cool in situ electron microscopy to show how even comparatively soft hairs can dull the sharp edge of steel razor blades.  See this cool video that does a good job explaining this.  Basically, with the proper angle of attack, the hair can torque the heck out of the metal at the very end of the blade, leading to microfracturing and chipping.
And here is my question:  would it be worth joining twitter and tweeting about papers?  I've held off for a long time, for multiple reasons.  With the enormous thinning of science blogs, I do wonder, though, whether I'd reach more people.

Wednesday, August 05, 2020

The energy of the Beirut explosion

The explosion in Beirut yesterday was truly awful and shocking, and my heart goes out to the residents.  It will be quite some time before a full explanation is forthcoming, but it sure sounds like the source was a shipment of explosives-grade ammonium nitrate that had been impounded from a cargo ship and (improperly?) stored for several years.

Interestingly, it is possible in principle to get a good estimate of the total energy yield of the explosion from cell phone video of the event.  The key is a fantastic example of dimensional analysis, a technique somehow more common in an engineering education than in a physics one.  The fact that all of our physical quantities have to be defined by an internally consistent system of units is actually a powerful constraint that we can use in solving problems.  For those interested in the details of this approach, you should start by reading about the Buckingham Pi Theorem.  It seems abstract and its applications seem a bit like art, but it is enormously powerful.  

The case at hand was analyzed by the British physicist G. I. Taylor, who was able to take still photographs in a magazine of the Trinity atomic bomb test and estimate the yield of the bomb.  Assume that a large amount of energy \(E\) is deposited instantly in a tiny volume at time \(t=0\), and this produces a shock wave that expands spherically with some radius \(R(t)\) into the surrounding air of mass density \(\rho\).  If you assume that this contains all the essential physics in the problem, then you realize that \(R\) must in general depend on \(t\), \(\rho\), and \(E\).  Now, \(R\) has units of length (meters).  The only way to combine \(t\), \(\rho\), and \(E\) into something with the units of length is \( (E t^2/\rho)^{1/5}\).  That implies that \( R = k (E t^2/\rho)^{1/5} \), where \(k\) is some dimensionless number, probably on the order of 1.  If you cared about precision, you could go and do an experiment:  detonate a known amount of dynamite on a tower, film the whole thing with a high-speed camera, and experimentally determine \(k\).  I believe that the constant is found to be close to 1.  
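As a sketch of the bookkeeping, one can verify numerically that \(E t^2/\rho\) carries units of (length)\(^{5}\) by tracking the exponents of the SI base units:

```python
from collections import Counter

# Minimal dimensional bookkeeping: represent a quantity's units as exponents
# of (kg, m, s) and check that E * t^2 / rho carries units of length^5.
def units(kg=0, m=0, s=0):
    return Counter(kg=kg, m=m, s=s)

E   = units(kg=1, m=2, s=-2)   # energy: kg m^2 / s^2
t   = units(s=1)               # time
rho = units(kg=1, m=-3)        # mass density: kg / m^3

# Exponents of E * t^2 / rho (multiplication adds exponents):
combo = {k: E[k] + 2 * t[k] - rho[k] for k in ("kg", "m", "s")}
print(combo)  # {'kg': 0, 'm': 5, 's': 0} -> the fifth root is a length
```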

Flipping things around and solving (taking \(k \approx 1\)), we find \(E = R^5 \rho/t^2\).  (A more detailed version of this derivation is here.)  

This youtube video is the best one I could find in terms of showing a long-distance view of the explosion with some kind of background scenery for estimating the scale.  Based on the "before" view and the skyline in the background, and a Google Maps satellite image of the area, I very crudely estimated the radius of the shockwave at about 300 m at \(t = 1\) second.  Using 1.2 kg/m\(^{3}\) for the density of air, that gives an estimated yield of about 3 trillion Joules, or the equivalent of around 0.7 kT of TNT.   That's actually pretty consistent with the idea that there were 2750 tons of ammonium nitrate to start with, though it's probably fortuitous agreement - that radius to the fifth power really can push the numbers around.
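The arithmetic, in a couple of lines (using the crude values above; the \(R^5\) dependence makes the answer very sensitive to the radius estimate):

```python
# Crude Taylor-scaling estimate of the Beirut yield, with R ~ 300 m at
# t = 1 s and k ~ 1.  Treat the inputs as rough eyeball values.
R, t, rho = 300.0, 1.0, 1.2        # m, s, kg/m^3
E = R**5 * rho / t**2              # Joules
kT_TNT = 4.184e12                  # J per kiloton of TNT
print(f"E ~ {E:.1e} J ~ {E/kT_TNT:.2f} kT TNT")
```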

Dimensional analysis and scaling are very powerful - it's why people are able to do studies in wind tunnels or flow tanks and properly predict what will happen to full-sized aircraft or ships, even without fully understanding the details of all sorts of turbulent fluid flow.  Physicists should learn this stuff (and that's why I stuck it in my textbook.)

Saturday, August 01, 2020

How long does quantum tunneling take?

The "tunneling time" problem has a long, fun history.  Here is a post that I wrote about this issue 13 years ago (!!).  In brief, in quantum mechanics a particle can "tunnel" through a "classically forbidden" region (a region where by simple classical mechanics arguments, the particle does not have sufficient kinetic energy to be there).  I've written about that more recently here, and the wikipedia page is pretty well done.  The question is, how long does a tunneling particle spend in the classically forbidden barrier?  

It turns out that this is not a trivial issue at all.  While that's a perfectly sensible question to ask from the point of view of classical physics, it's not easy to translate that question into the language of quantum mechanics.  In lay terms, a spatial measurement tells you where a particle is, but doesn't say anything about where it was, and without such a measurement there is uncertainty in the initial position and momentum of the particle.  

Some very clever people have thought about how to get at this issue.  This review article by Landauer and Martin caught my attention when I was in grad school, and it explains the issues very clearly.  One idea people had (Baz' and Rybachenko) is to use the particle itself as a clock.  If the tunneling particle has spin, you can prepare the incident particles to have that spin oriented in a particular direction.  Then have a magnetic field confined to the tunneling barrier.  Look at the particles that did tunnel through and see how far the spins have precessed.  This idea is shown below.
"Larmor clock", from this paper

This is a cute idea in theory, but extremely challenging to implement in an experiment.  However, this has now been done by Ramos et al. from the Steinberg group at the University of Toronto, as explained in this very nice Nature paper.  They are able to do this and actually see an effect that Landauer and others had discussed:  there is "back-action", where the presence of the magnetic field itself (essential for the clock) has an effect on the tunneling time.  Tunneling is not instantaneous, though it is faster than the simple "semiclassical" estimate (that one would get by taking the magnitude of the imaginary momentum in the barrier and using that to get an effective velocity).  Very cool.
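For a concrete sense of scale, here is the simple semiclassical estimate mentioned above, sketched for an illustrative barrier (an electron tunneling through a 1 nm barrier sitting 1 eV above its energy - assumed numbers, not the parameters of the experiment):

```python
import math

# "Semiclassical" tunneling-time estimate: take the magnitude of the
# imaginary momentum in the barrier, hbar*kappa = sqrt(2m(V-E)), turn it
# into an effective velocity, and divide the barrier width by it.
# Barrier parameters below are illustrative assumptions.
hbar = 1.0546e-34      # J s
m_e  = 9.109e-31       # kg
eV   = 1.602e-19       # J

V_minus_E = 1.0 * eV   # barrier height above particle energy (assumed)
L = 1e-9               # barrier width, 1 nm (assumed)

kappa = math.sqrt(2 * m_e * V_minus_E) / hbar   # 1/m
v_eff = hbar * kappa / m_e                      # m/s
tau = L / v_eff                                 # s
print(f"kappa ~ {kappa:.2e} /m, tau ~ {tau*1e15:.1f} fs")
```

Femtosecond scales like this are why direct measurements are so hard, and why the spin-precession "clock" is such a clever workaround.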

Saturday, July 25, 2020

Kitchen science: insulated cups

An impromptu science experiment this morning.  A few months ago we acquired some very nice insulated tumblers (initially from causebox and then more from here).  Like all such insulated items, the inner and outer walls are made from a comparatively lousy thermal conductor, in this case stainless steel.  (Steel is an alloy, and the disorder in its micro- and nanoscale structure scatters electrons, giving it a lower electrical (and hence thermal) conductivity than pure metals.)  Ideally the walls only touch at the very top lip of the cup where they are joined, and the space between the walls has been evacuated to minimize heat conduction by any trapped gas in there.  When working well, so that heat transfer has to take place along the thin metal wall, the interior wall of the cup tends to sit very close to the temperature of whatever liquid is in there, and the exterior wall tends to sit at room temperature.
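To put rough numbers on the steel argument, one can use the Wiedemann-Franz law, which ties a metal's electronic thermal conductivity to its electrical conductivity; the conductivity values below are approximate textbook-scale figures, not measurements of these cups:

```python
# Wiedemann-Franz estimate: kappa_electronic ~ L0 * T * sigma.  Disorder
# scattering in the alloy lowers sigma, and hence kappa.
L0 = 2.44e-8           # Lorenz number, W Ohm / K^2
T = 300.0              # K
sigma_steel = 1.4e6    # S/m, stainless steel (approximate)
sigma_cu    = 6.0e7    # S/m, copper, for comparison (approximate)

for name, sigma in [("stainless steel", sigma_steel), ("copper", sigma_cu)]:
    kappa = L0 * T * sigma
    print(f"{name}: kappa_electronic ~ {kappa:.0f} W/(m K)")
```

A factor of ~40 between steel and copper is why the thin steel wall is such a poor heat leak.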

We accidentally dropped one of the cups this morning, making a dent near the base.  The question was, did this affect the thermal insulation of that cup?  To test this, we put four ice cubes and four ounces of water from our refrigerator into each cup and let them sit on the counter for 15 minutes.  Then we used an infrared kitchen thermometer (with a handy diode laser for aiming) to look at the exterior and interior wall temperatures.  (Apologies for the use of Fahrenheit units.)  Check this out.


The tumbler on the left is clearly doing a better job of keeping the outside warm and the inside cold.  If we then scrutinize the tumbler on the right we find the dent, which must be deep enough to bring the inner and outer walls barely into contact.


The bottom line:  Behold, science works.  Good insulated cups are pretty impressive engineering, but you really should be careful with them, because the layers really are close together and can be damaged.

Thursday, July 23, 2020

Recently in the arxiv - van der Waals interfaces

Sometimes when looking at the pace of results coming out of the 2D material community, I am reminded of an old joke from Tom Lehrer about super-productive people:  "It's people like that who make you realize how little you've accomplished. It's a sobering thought, for example, that, when Mozart was my age, he had been dead for two years." (See here and then listen to the whole album - National Brotherhood Week has particular resonance this year.)

Recently in the arxiv, there were two different back-to-back preprint pairs uploaded by extremely strong collaborations in the trade of creating new condensed matter systems at the interfaces of stacked van der Waals materials (systems like graphene and mica, that can be exfoliated down to atomically thin layers).  

The first pair of papers (and my apologies if I missed others) were this one and this one.  In the former, the investigators take advantage of the band energetics of \(\alpha\)-RuCl3: when it is stacked with various 2D materials (graphene, bilayer graphene, WSe2), electrons are spontaneously transferred from the 2D material to the \(\alpha\)-RuCl3, because the normally empty conduction band of \(\alpha\)-RuCl3 lies at lower energy than the top of the valence band of the 2D material.  This leads to very high hole concentrations within the graphene (etc.), with comparatively minimal disorder, reminiscent of modulation doping, the technique used to achieve outstanding charge mobility in 2D electron and hole gases.  The latter paper is complementary to the former:  the investigators use near-field optical techniques to look at both the plasmon properties of the graphene in such structures, and can back out the optical conductivity of the now-electron-doped \(\alpha\)-RuCl3.

The second pair of papers, this one and this one, show a whole hierarchy of insulating states that appear in moire structures made from twisted WS2/WSe2 bilayers.  As I've written before, putting together close but not identical lattices and/or twisting one layer relative to another leads to a moire pattern, and therefore a superlattice for charge carriers at that interface.  Both groups find (the first using optical methods, the second using microwave techniques) that for a large number of rational fraction ratios between the number of charge carriers and the number of lattice sites, the system is very strongly insulating.  Each insulating state corresponds to a particular periodic arrangement of the charge carriers, trying to stay generally as far away from each other as possible to minimize their potential energy.  These can be analogous to Wigner crystals and charge density waves.
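For a sense of the length scales involved, here is the standard small-angle moire-period estimate for two identical lattices (a lattice mismatch, as in WS2/WSe2, modifies this, so treat the numbers as illustrative):

```python
import math

# Moire superlattice period from a relative twist theta between identical
# lattices of constant a: lambda = a / (2 sin(theta/2)) ~ a/theta for small
# angles.  a ~ 0.33 nm is roughly the WSe2 lattice constant.
a = 0.33e-9                        # lattice constant, m (approximate)
for theta_deg in (1.0, 2.0, 5.0):
    theta = math.radians(theta_deg)
    lam = a / (2 * math.sin(theta / 2))
    print(f"{theta_deg:.0f} deg twist -> moire period ~ {lam*1e9:.1f} nm")
```

Tens of nanometers per moire cell means thousands of atoms per superlattice site, which is why these systems host flat bands and strongly correlated states.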

Very cool stuff.

Wednesday, July 22, 2020

APS Division of Condensed Matter Physics invited symposium nominations


While no one knows right now whether the 2021 March Meeting will be in person, online, or some hybrid form next spring, now is the time to put in your nominations for invited speakers and symposia for the Division of Condensed Matter Physics.  The deadline to nominate is August 17.  The whole community benefits from high quality invited talks, so if you're in a position to do this, please think about it.


To Members of the Division of Condensed Matter Physics:

APS is now accepting invited speaker and invited symposium nominations for the March Meeting in 2021. Here is the link to the APS website for submitting nominations.

The Meeting is planned for March 15 to 19, 2021. Join more than 11,000 physicists attending, presenting, and networking at the APS March Meeting 2021. Showcase your work to a global audience of physicists, scientists, and students representing APS units and committees and explore groundbreaking research from industry, academia, and major labs.

Note that the decision regarding a virtual or in-person March Meeting will be made later this summer.

Jim Sauls
DCMP Secretary/Treasurer

Monday, July 13, 2020

Quantum Coffeehouse and other physics videos

Who doesn't need more videos to watch these days?

Erica Carlson has started a new Quantum Coffeehouse video series, including interviews with practicing physicists (including yours truly).  She has also presented a "Great Course", "Understanding the Quantum World".

I'm also a fan of Physics Girl.  I really liked her recent video with supercooled sodium acetate.

Minute Physics is truly outstanding, including their look at N95 masks and how they use electrets to gather and trap polarizable particles.

Andrew Dotson is reliably funny and insightful.

For the musically inclined, acapellascience is engaging, including their particularly timely William Rowan Hamilton.

I also have to plug my colleague Jason Hafner's channel, which netted him an on-screen appearance in the new movie Palm Springs.  Also making an appearance in the movie is Jim Freericks' edx course, Quantum Mechanics for Everyone.




Wednesday, July 08, 2020

Brief items - updated

Some further items of note:
  • There is great anxiety and frustration over the latest pronouncement from DHS/ICE about international students in the US.  Let me give a little context.  For many years there has been a rule that international students studying in the US can take no more than 3 credits (or equivalent) per semester of purely online instruction. The point of that was to prevent many people from applying for F visas and then "studying" at online-only diploma mills while actually working. That is, it was originally a policy meant to ensure that student visas go to legitimate international students and scholars pursuing degrees at accredited universities.  In the spring, when the pandemic hit and many universities transitioned to online instruction in the middle of the semester, DHS granted a waiver on this requirement.  Well, now they are trying to rescind that, and are doing so in a particularly draconian way: as written, if a university goes online-only, either from the start of the semester or even partway through due to public health concerns, the international students would face having to leave the US on short notice.   This is a terrible, stupid, short-sighted way to handle this situation, and it doesn't remotely serve the best interests of any constituency (student, university, or country).  Unsurprisingly, many, many organizations are pushing back against this.  Hopefully there will be changes and/or workarounds.  UPDATE:  The administration appears to have backed down from this.  Hopefully that will stick.
  • On to science.  Quanta has an article about the origins of the rigidity of glass.  The discussion there is about whether there is a kind of hidden structural order in the glassy material.  Fundamentally (as I've written previously), rigidity in any solid results from a combination of very slow timescales for atomic motion (due to lack of thermal energy available to overcome "barriers") and the Pauli principle giving a hard-core repulsion between atoms.  Still, the question of the underlying nature of glassy systems remains fascinating.
  • The 2D materials experts at Columbia have shown clean fractional quantum Hall physics in a monolayer of WSe2.  The actual paper is here.  I have yet to come up with a really nice, generally accessible write-up of the FQH effect. The super short version:  Confine charge carriers in strictly two dimensions, and throw in a large magnetic field perpendicular to the plane (such that the energy associated with cyclotron motion dominates the kinetic energy). At certain ratios of magnetic field to number of charge carriers, the charge carriers can condense into new collective states (generally distinguished by topology rather than broken symmetries like the liquid-gas or nonmagnetic/ferromagnetic phase transitions).  The fractional quantum Hall states can have all sorts of unusual properties, but the key point here is that they are fragile.  Too much disorder (like missing atoms or charged impurities), and the energy associated with that disorder can swamp out the energy savings of condensing into such a state.  It's remarkable that the material quality of the monolayer transition metal dichalcogenide (and its encapsulating boron nitride surroundings) is so high.  Seeing how FQH states evolve in this new material system with rich band structure should be interesting.
  • I feel bad for only now learning about this great series of talks about the state of the art in spintronics, trying to understand, engineer, and control the motion of spin.
  • For your animal video needs, get the behind-the-scenes story about Olive and Mabel here.
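On the fractional quantum Hall item above: the relevant "ratio of magnetic field to number of charge carriers" is the Landau filling factor \(\nu = nh/eB\).  A quick illustrative estimate of the field needed to reach \(\nu = 1/3\) (the carrier density below is an assumed typical 2D value, not a number from the paper):

```python
# Landau filling factor nu = n h / (e B); solve for the field B that puts
# an illustrative 2D carrier density at nu = 1/3.
h = 6.626e-34      # Planck constant, J s
e = 1.602e-19      # electron charge, C
n = 3e15           # carriers per m^2 (assumed; 3e11 per cm^2)
nu = 1.0 / 3.0
B = n * h / (e * nu)
print(f"B ~ {B:.1f} T for nu = 1/3 at n = {n:.0e} /m^2")
```

Tens of tesla at realistic densities is part of why these experiments are hard.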

Tuesday, June 30, 2020

How do hot electrons get hot?

We have a paper that came out today that was very fun.  It's been known for a long time that if you apply a sufficiently large voltage \(V\) to a tunnel junction, it is possible to get light emission, as I discussed here a bit over a year ago, and as is shown at the right.  Conventionally, the energy of the emitted photons \(\hbar \omega\) is less than \(eV\) (give or take the thermal energy scale \(k_{\mathrm{B}}T\) ) if the idea is that single-electron processes are all that can happen.  

In this new paper looking at planar metal tunnel junctions, we see several neat things:
  • The emitted spectra look like thermal radiation with some effective temperature for the electrons and holes \(T_{\mathrm{eff}}\), emitted into a device-specific spectral shape and polarization (the density of states for photons doesn't look like that of free space, because the plasmon resonances in the metal modify the emission, an optical antenna effect).  (Figure: once the effective temperature is taken into account, the raw spectra all collapse onto a single shape for a given device.)

  • That temperature \(T_{\mathrm{eff}}\) depends linearly on the applied voltage, when looking at a whole big ensemble of devices.  This is different from what others have previously seen.  That temperature, which describes a steady-state nonequilibrium tail of the electronic distribution local to the nanoscale gap, can be really high - 2000 K - much higher than the temperature of the atoms in the lattice.
  • In a material with really good plasmonic properties, it is possible to have almost all of the emitted light come out at energies larger than \(eV\) (as in the spectra above).  That doesn't mean we're breaking conservation of energy, but it does mean that the emission process is a multi-electron one.  Basically, at comparatively high currents, a new hot carrier is generated before the energy from the last (or last few) hot carriers has had a chance to leave the vicinity (either by carrier diffusion or dumping energy to the lattice). 
  • We find that the plasmonic properties matter immensely, with the number of photons out per tunneling electron being 10000\(\times\) larger for pure Au (a good plasmonic material) than for Pd (a poor plasmonic material in this energy range).  
That last point is a major clue.  As we discuss in the paper, we think this implies that plasmons don't just couple the light out efficiently.  Rather, the plasmons also play a key role in generating the hot nonequilibrium carriers themselves.   The idea is that tunneling carriers don't just fly through - they can excite local plasmon modes most of which almost immediately decay into hot electron/hole excitations with energies up to \(eV\) away from the Fermi level.  Hot carriers are potentially useful for a lot of things, including chemistry.  I'm also interested in whether some fun quantum optical effects can take place in these extreme nanoscale light sources.  Lots to do!
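As a sketch of how an effective temperature like \(T_{\mathrm{eff}}\) can be read off an emission tail: if the intensity falls off as \(\exp(-\hbar\omega/k_{\mathrm{B}}T_{\mathrm{eff}})\), the slope of the log spectrum gives the temperature.  The data below are synthetic and purely illustrative, not from our paper:

```python
import numpy as np

# If I(hw) ~ exp(-hw / kB T_eff), the slope of log(I) vs photon energy
# is -1/(kB T_eff).  Fit synthetic data with a known T to show the idea.
kB = 8.617e-5                      # Boltzmann constant, eV/K
T_true = 2000.0                    # K, the scale of T_eff seen in the paper
hw = np.linspace(1.2, 1.8, 50)     # photon energies, eV (assumed window)
I = np.exp(-hw / (kB * T_true))    # idealized exponential spectral tail

slope, _ = np.polyfit(hw, np.log(I), 1)
T_fit = -1.0 / (kB * slope)
print(f"Extracted T_eff ~ {T_fit:.0f} K")  # recovers the 2000 K input
```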

Saturday, June 27, 2020

Brief items

Some science items that crossed my path that you may find interesting:
  • This article at Quanta is a nice look at the Ising model for a general audience.  When I took graduate statistical mechanics from Lenny Susskind, he told the story of Lars Onsager just casually mentioning in the middle of a conference talk that he had solved the 2D Ising model exactly.
  • If you have any interest in the modern history of advanced transistors, the special FinFET ones that are now the mainstays of ultrascaled high performance processors, you might find this article to be fun.
  • With all the talk about twisted bilayers of van der Waals materials for exotic electronic properties, it’s cool to see this paper, which looks at the various nonlinear optical processes that can be enabled in similar structures.  Broken structural symmetries are the key to allowing certain nonlinear processes, and the moire plus twist approach is quite the playground.
  • This preprint is very cool, where the authors have made basically an interferometer in the fractional quantum Hall regime for electrons confined in 2D, and can show clean results that demonstrate nontrivial statistics.  The aspect of this that I think is hard for non-experimentalists to appreciate is how challenging it is to create a device like this that is so clean - the fractional quantum Hall states are delicate, and it is an art form to create devices to manipulate them without disorder or other problems swamping what you want to measure.
Coming at some point, a post or two about my own research.

Wednesday, June 24, 2020

A nation of immigrants

Real life has been intruding rudely on my blogging time.  I will try to step up, but nothing seems to be slowing down this summer.

I sense from the comments on my last post that there is some demand to talk about US immigration policy as it pertains to the scientific community (undergraduate and graduate students, postdocs, scholars, faculty members).  I've been doing what little I can to try to push back against what's going on.  I think the US has benefited enormously from being a training destination for many of the world's scientists and engineers - the positive returns to the country overall and the economy have been almost unquantifiably large.  Current policies seem to me to be completely self-defeating.  As I wrote over three years ago alluding to budget cuts (which thankfully Congress never implemented), there is hysteresis and an entropic component in policy-making.  It's depressingly easy to break things that can be very difficult to repair.  Using immigration policy to push away the world's scientists and engineers from the US is a terrible mistake that runs the risk of decades of long-term negative consequences.

Monday, June 15, 2020

The foil electret microphone

Pivoting back toward science by way of technology.... Some very large fraction of the microphones out there in electronic gadgets are based on electrets.  An electret is an insulating material with a locked-in electrical polarization - for example, take a molten or solvated polymer, embed highly polar molecules in there, and solidify in the presence of a large polarizing electric field.  The electrical polarization means that there is an effective surface charge density.  You can make that electret into a free-standing foil or a film coating a backing to make a diaphragm.  When that film vibrates, it will generate an oscillating voltage on a nearby electrode (which could, say, be the gate electrode of a field-effect transistor).  Voila - a microphone that is simple, readily manufacturable, and doesn't need an external power supply.  

While electret microphones are losing some market share to microelectromechanical ones in things like airpods, they've played a huge part in the now-ubiquitous phone and acoustic technologies of the late 20th and early 21st centuries.  When I was a postdoc I was fortunate one day to meet their coinventor, James West, who was still at Bell Labs, when (if I recall correctly) his summer student gave a presentation on some lead-free ultra-adhesive solder they were working on.  He was still patenting inventions within the last two years, in his late 80s - impressive!

Monday, June 08, 2020

Change is a depressingly long time in coming.

People don't read this blog for moralizing, and I surely don't have any particular standing, but staying silent out of concern for saying the wrong thing isn't tenable.  Black lives matter.  There is no more stark reminder of the depressingly long timescales for social progress than the long shadow cast by the US history of slavery.  I have to hope that together we can make lasting change - the scale of the outpouring in the last week has to be a positive sign.  The AAAS announced that on Wednesday June 10 they will be "observing #shutdownSTEM, listening to members of our community who are sharing resources and discussing ways to eliminate racism and make STEM more inclusive of Black people. www.shutdownstem.com. We encourage you to join us."  It's a start.

Wednesday, June 03, 2020

Non-academic careers and physics PhDs

With so many large-scale events happening right now (the pandemic, resulting economic displacement, the awful killing of George Floyd and resulting protests and unrest, federal moves regarding international students), it's hard not to feel like blogging is a comparatively self-indulgent activity.  Still, it is a way to try to restore a feeling of normalcy.  

The Pizza Perusing Physicist had asked, in this comment, if I could offer any guidance about non-academic careers for physics PhDs (including specific fields and career paths), beyond cliches about how PhD skills are valued by many employers.  I don't have any enormous font of wisdom on which to draw, but I do have a few points:
  • I do strongly recommend reading A PhD is Not Enough.  It's a bit older now, but has good insights.
  • It is interesting to look at statistics on where people actually land.  According to the AIP, about half of physics PhDs take initial academic jobs (postdocs and others); a third go to the private sector; and 14% go to government positions.  Similarly, you can see the skills that recent PhDs say they use in their jobs.  
  • I found it particularly interesting to read the comments from people ten years out from their degrees, since they have some greater perspective - seriously, check out that document.
  • Those latter two AIP documents show why "PhD skills are valued by employers" has become cliched - it's true.
  • In terms of non-academic career options for physics PhDs, there really are a wide variety, though like any career trajectory a great deal depends on the skills, flexibility, and foresight of the person.  Technical problem solving is a skill that a PhD should have learned - how to break big problems up into smaller ones, how to consider alternatives and come up with ways to test those, etc.  There is often a blurry line between physics and some types of engineering, and it is not uncommon for physics doctorates to get jobs at companies that design and manufacture stuff - as a condensed matter person, I have known people who have gone to work at places like Intel, Motorola, Seagate, National Instruments, Keysight (formerly Agilent), Northrop Grumman, Lockheed, Boeing, etc.  It is true that it can be hard to get your foot in the door and even know what options are available.  I wish I had some silver bullet on this, but your best bets are research (into job openings), networking, and career fairs, including at professional conferences.  Startups are also a possibility, though those come with their own risks.  Bear in mind that your detailed technical knowledge might not be what companies are looking for - I have seen experimentalist doctoral students go be very successful doing large-scale data analysis for oil services firms, for example.  Likewise, many people in the bioengineering and medical instrumentation fields have physics backgrounds.
  • If academia isn't for you, start looking around on the side early on.  Get an idea of the choices and a feel for what interests you. 
  • Make sure you're acquiring skills as well as getting your research done.  Learning how to program, how to manipulate and analyze large data sets, statistical methods - these are generally useful, even if the specific techniques evolve rapidly.
  • Communication at all levels is a skill - work at it.  Get practice writing, from very short documents (summarize your research in 150 words so that a non-expert can get a sense of it) to papers to the thesis.  Being able to write and explain yourself is essential in any high level career.  Get practice speaking with comfort, from presentations to more informal 1-on-1 interactions.  Stage presence is a skill, meaning it can be learned.  
  • Don't discount think tanks/analysis firms/patent firms - people who can tell the difference between reality and creative marketing language (whether about products or policies) are greatly valued.
  • Similarly, don't discount public policy or public service.  The fraction of technically skilled people in elected office in the US is woefully small (while the chancellor of Germany has a PhD in quantum chemistry).  These days, governing and policy making would absolutely benefit from an infusion of people who actually know what technology is and how it works, and can tell the difference between an actual study and a press release.
I'm sure more things will occur to me after I publish this.  There is no one-size-fits-all answer, but that's probably a good thing.

Wednesday, May 27, 2020

The National Science and Technology Foundation?

A proposal is being put in front of Congress that would reshape the National Science Foundation into the National Science and Technology Foundation.  The Senate bill is here, and the House equivalent bill is here.  The actual text of the Senate bill is here in pdf form.   In a nutshell, this "Endless Frontiers" bill (so named to echo the Vannevar Bush report that spurred the creation of the NSF in the first place) would do several things, including:
  • Create a Technology Directorate with its own advisory board (distinct from the National Science Board)
  • Identify ten key technology areas (enumerated in the bill, initially (i) artificial intelligence and machine learning; (ii) high performance computing, semiconductors, and advanced computer hardware; (iii) quantum computing and information systems; (iv) robotics, automation, and advanced manufacturing; (v) natural or anthropogenic disaster prevention; (vi) advanced communications technology; (vii) biotechnology, genomics, and synthetic biology; (viii) cybersecurity, data storage, and data management technologies; (ix) advanced energy; and (x) materials science, engineering, and exploration relevant to the other key technology focus areas)
  • Have funds allocated by program managers who may use peer review in an advisory role (so, more like DOD than traditional NSF)
  • Invest $100B over 5 years, with the idea that the rest of NSF would also go up, but this new directorate would get the large bulk of the funding
This article at Science does a good job outlining all of this.  The argument is, basically, that the US is lagging in key areas and is not doing a good job translating basic science into technologies that ensure international primacy (with China being the chief perceived rival, though this is unstated in the bills, of course).  If this came to pass, and it's a big "if", it could fundamentally alter the character and mission of the NSF.  Seeing bipartisan congressional enthusiasm for boosting funding to the NSF is encouraging, but I think there are real hazards in pushing funding even farther toward applications, particularly under a governance and funding-decision model that would look so different from the traditional NSF.  

It's worth noting that people have been having these arguments for a long time.  Here is a 1980 (!) article from Science, from when a "National Technology Foundation" proposal was pending before Congress for exactly the same perceived reasons (poor translation of basic science into technology and business competitiveness), though back then the Soviets were presumably the rivals of concern.  The NSF has their own history that mentions this, and how this tension led to the creation of the modern Engineering Directorate within NSF.  

Interesting times.  Odds are this won't pass, but it's a sign of bipartisan concern about the US falling behind its technological rivals.

Wednesday, May 20, 2020

Yet more brief items

Between writing deadlines, battling with reviewer 3 (I kid, I kid), and trying to get set for the tentative beginnings of restarting on-campus research, it's been a busy time.  I really do hope to do more blogging soon (suggested topics are always appreciated), but for now, here are a few more brief items:
  • This expression of editorial concern about this paper was an unwelcome surprise.  Hopefully all will become clear.  Here is a statement by the quantum information science-related center at Delft.
  • I happened across this press release, pointing out that nVidia's new chip will contain 54 billion transistors (!) fabbed with a 7 nm process.  For reference, the "7 nm" there is a label describing particular fabrication processes using finFETs, and doesn't really correspond to a physical feature size of 7 nm.  I discussed this here before.  Still impressive.
  • There is a lot of talk about moving cutting-edge semiconductor fabrication plants back to the US.  Intel and parts of GlobalFoundries aside, a large fraction of high end chip volume is produced outside the US.  There have long been national security and intellectual property concerns about the overseas manufacturing of key technologies, and the US DOD has decided that bringing some of this capability back on-shore is safer and more secure.  I'm surprised it's taken this long, though the enormous capital cost in setting up a foundry explains why these things are often done by large consortia.  The pandemic has also shown that depending on overseas suppliers for just-in-time delivery of things may not be the smartest move.
  • Speaking of that, I can't help but wonder about the cycle of unintended consequences that we have in our economic choices.  I've ranted (way) before about how the way the stock market and corporate governance function these days has basically squeezed away most industrial basic research.  Those same attitudes gave us "just-in-time" manufacturing and somehow convinced generations of corporate management that simple things like warehouses and stockrooms were inherently bad.  "Why keep a stockroom around, when you can always order an M5 Allen head bolt via the internet and get it shipped overnight from hundreds or thousands of miles away?" runs the argument, the same kind of bogus accounting that implies that the continued existence of a space in the Bell Labs parking lot used to cost Lucent $30K/yr.   So, companies got rid of inventory, got rid of local suppliers, and then were smacked hard by the double-whammy of a US-China trade war and a global pandemic.  Now we are being bombarded with breathless stories about how the pandemic and people working from home might mean the complete delocalization of work - a vision of people working from anywhere, especially places more financially sustainable than the Bay Area.  I'm all for telecommuting when it makes sense, and minimizing environmental impact, and affordable places to live.  That being said, it's hard not to feel like a truly extreme adoption of this idea is risky.  What if, heaven forbid, there's a big disruption to the communications grid, such as a Carrington Event?  Wouldn't that basically obliterate the ability of completely delocalized companies to function?  
  • To end on a much lighter note, these videos (1, 2, 3, 4) have been a positive product of the present circumstances, bringing enjoyment to millions.

Sunday, May 10, 2020

Brief items

Apologies for the slowed frequency of posting.  Academic and research duties have been eating a lot of bandwidth.  Here are a few items that may be of interest:

  • This article about Shoucheng Zhang is informative, but at the same time very sad.  Any geopolitics aside, he was an intense, driven person who put enormous pressure on himself.  It says something about self-perception under depression that he was concerned that he was somehow not being recognized.  
  • This paper caught my eye.  If you want to see whether there is some dependence of electronic conduction on the relative directions of a material's crystal axes and the current, it makes sense to fabricate a series of devices oriented in different directions.  These authors take a single epitaxial film of a material (in this case the unconventional superconductor Sr2RuO4) and carve it into a radial array of differently oriented strips of material with measurement electrodes.   They find that there do seem to be "easy" and "hard" directions for transport in the normal state that don't have an obvious relationship to the crystal symmetry directions.  A similar approach was taken here in a cuprate superconductor.  
  • I like the idea of making characterization tools broadly available for low cost - it's great for the developing world and potentially for use in public secondary education.  This work shows plans for a readily producible optical microscope that can have digital imaging, motorized sample positioning, and focusing for a couple of hundred dollars.  Fancier than the foldscope, but still very cool.  Time to think more about how someone could make a $100 electron microscope....
  • Here is a nice review article from the beginning of the year about spin liquids.
  • I was going to point out this article about ultralow temperature nanoelectronics back in March, but the pandemic distracted me.  From grad school I have a history in this area, and the progress is nice to see.  The technical challenges of truly getting electrons cold are formidable.
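As an aside on the anisotropic transport measurement above: a radial array is a clean geometry because a long, narrow strip pins the current direction, so each device samples one orientation.  In a simple toy picture (my own sketch with made-up principal resistivities, not the authors' analysis), the longitudinal resistivity of a strip at angle \(\theta\) to a principal axis interpolates between the principal values:

```python
import math

# Toy model of a "radial array" of strips cut at angles theta to a
# principal crystal axis, in a material with principal resistivities
# rho_a and rho_b.  For a long thin strip the current is constrained
# along the strip, so the measured longitudinal resistivity is
#   rho(theta) = rho_a * cos^2(theta) + rho_b * sin^2(theta).
# The numerical values here are made up purely for illustration.

def rho_strip(theta_deg, rho_a=1.0, rho_b=3.0):
    th = math.radians(theta_deg)
    return rho_a * math.cos(th) ** 2 + rho_b * math.sin(th) ** 2

for angle in (0, 30, 45, 60, 90):
    print(f"theta = {angle:2d} deg: rho = {rho_strip(angle):.2f} (arb. units)")
```

Deviations of the measured \(\rho(\theta)\) from this smooth two-fold pattern - "easy" and "hard" directions misaligned with the crystal axes - are exactly the kind of surprise the radial-array experiment is designed to reveal.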

Thursday, April 30, 2020

On the flexural rigidity of a slice of pizza

People who eat pizza (not the deep dish casserole style from Chicago, but normal pizza), unbeknownst to most of them, have developed an intuition for a key concept in elasticity and solid mechanics. 

I hope that all right-thinking people agree that pizza slice droop (left hand image) is a problem to be avoided.  Cheese, sauce, and toppings are all in serious danger of sliding off the slice and into the diner's lap if the tip of the slice flops down.  Why does the slice tend to droop?   If you hold the edge of the crust and try to "cantilever" the slice out into space, the weight of the sauce/toppings exerts a downward force, and therefore a torque that tends to droop the crust.  

A simple way to avoid this problem is shown in the right-hand image (shamelessly stolen from here).  By bending the pizza slice about an axis that runs from the crust to the slice tip, the same pizza slice becomes much stiffer against drooping.   Why does this work?  Despite what the Perimeter Institute says here, I really don't think that differential geometry has much to do with this problem, except in the sense that there are constraints on what the crust can do if its volume is approximately conserved.  

The reason the curved pizza slice is stiffer turns out to be the same reason that an I-beam is stiffer than a square rod of the same cross-sectional area.  Imagine an I-beam with a heavy weight (its own, for example) that would tend to make it droop.  In drooping a tiny bit, the top of the I-beam would get stretched out, elongated along the \(z\) direction - it would be in tension.  The bottom of the I-beam would get squeezed, contracted along the \(z\) direction - it would be in compression.  Somewhere in the middle, the "neutral axis", the material would be neither stretched nor squeezed.  We can pick coordinates such that the line \(y=0\) is the neutral axis, and in the linear limit, the amount of stretching (strain) at a distance \(y\) away from the neutral axis would just be proportional to \(y\).  In the world of linear elasticity, the amount of restoring force per unit area ("normal stress") exhibited by the material is directly proportional to the amount of strain, so the normal stress \(\sigma_{zz} \propto y\).  If we add up all the little contributions of patches of area \(\mathrm{d}A\) to the restoring torque around the neutral axis, we get something proportional to \(\int y^2 \mathrm{d}A\).  The bottom line:  All other things being equal, "beams" with cross-sectional area far away from the neutral axis resist bending torques more than beams with area close to the neutral axis.
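If it helps to see the \(\int y^2 \mathrm{d}A\) comparison worked out, here is a quick numerical sketch (the dimensions are illustrative choices of mine, not standard beam sizes) comparing a square rod to an I-beam of the same cross-sectional area:

```python
# Compare the area moment of inertia I = ∫ y^2 dA about the neutral axis
# for a square rod and an I-beam of equal cross-sectional area.
# All dimensions are illustrative choices, not standard beam tables.

def I_rect(width, height):
    # Rectangle centered on the neutral axis: I = w * h^3 / 12
    return width * height**3 / 12.0

# Square rod: 4 cm x 4 cm, area 16 cm^2
I_square = I_rect(4.0, 4.0)

# I-beam with the same 16 cm^2 area: two 6 cm x 1 cm flanges plus a
# 1 cm x 4 cm web (overall height 6 cm).  Treat it as a 6 cm x 6 cm
# bounding rectangle minus two 2.5 cm x 4 cm side notches, which are
# centered on the neutral axis.
I_beam = I_rect(6.0, 6.0) - 2.0 * I_rect(2.5, 4.0)

area_square = 4.0 * 4.0
area_beam = 2.0 * (6.0 * 1.0) + 1.0 * 4.0
assert area_square == area_beam  # same amount of material in each

print(f"I (square rod): {I_square:.1f} cm^4")
print(f"I (I-beam):     {I_beam:.1f} cm^4")
print(f"ratio:          {I_beam / I_square:.1f}x")
```

With these numbers the I-beam comes out nearly a factor of four stiffer, purely because more of its area sits far from the neutral axis.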

Now think of the pizza slice as a beam.  (We will approximate the pizza crust as a homogeneous elastic solid - not crazy, though really it's some kind of mechanical metamaterial carbohydrate foam.)  When the pizza slice is flat, the farthest that some constituent bit of crust can be from the neutral axis is half the thickness of the crust.  When the pizza slice is curved, however, much more of its area is farther from the neutral axis - the curved slice will then resist bending much better, even made from the same thickness of starchy goodness as the flat slice.
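To put rough numbers on the pizza version, here is a little sketch (a toy model with made-up dimensions, treating the crust as a thin uniform sheet) comparing \(\int y^2 \mathrm{d}A\) for a flat strip and the same strip bent into a circular arc:

```python
import math

# Toy model (made-up dimensions): treat the crust cross-section as a
# thin sheet of width w and thickness t, either flat or bent into a
# circular arc of radius R with the same arc length w.  Compute the
# second moment of area I = ∫ y^2 dA about each shape's neutral axis.

def I_flat(w, t):
    # Flat strip about its midline: closed form w * t^3 / 12
    return w * t**3 / 12.0

def I_arc(w, t, R, n=20000):
    # Bend the strip into an arc of half-angle w / (2R); sum small patches.
    half_angle = w / (2.0 * R)
    dtheta = 2.0 * half_angle / n
    ys, areas = [], []
    for i in range(n):
        theta = -half_angle + (i + 0.5) * dtheta
        ys.append(R * (1.0 - math.cos(theta)))  # height of patch midline
        areas.append(t * R * dtheta)            # patch area (thin-sheet approx)
    A = sum(areas)
    ybar = sum(y * a for y, a in zip(ys, areas)) / A  # neutral-axis height
    # Offset of each patch from the neutral axis, plus the patches' own
    # through-thickness contribution (which sums to w * t^3 / 12):
    return sum(a * (y - ybar) ** 2 for y, a in zip(ys, areas)) + I_flat(w, t)

w, t, R = 10.0, 0.5, 8.0  # cm: slice width, crust thickness, bend radius
print(f"flat slice:   I = {I_flat(w, t):.3f} cm^4")
print(f"curved slice: I = {I_arc(w, t, R):.3f} cm^4")
```

For these (invented) numbers the gentle fold buys roughly an order of magnitude in stiffness, consistent with everyday experience that curling the slice rescues it from drooping.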

(Behold the benefits of my engineering education.) 

Wednesday, April 22, 2020

Brief items

A few more links to help fill the time:
  • Steve Simon at Oxford has put his graduate solid state course lectures online on youtube, linked from here.  I'd previously linked to his undergrad solid state lectures.   Good stuff, and often containing fun historical anecdotes that I hadn't known before.
  • Nature last week had this paper demonstrating operations of Si quantum dot-based qubits at 1.5 K with some decent fidelity.  Neat, showing that long electron spin coherence times are indeed realizable in these structures at comparatively torrid conditions.
  • Speaking of quantum computing, it was reported that John Martinis is stepping down as lead of google's superconducting quantum computation effort (these folks).  I've always thought of him as an absolutely fearless experimentalist, and while no one is indispensable, his departure leads me to lower my expectations about google's progress.  Update:  Forbes has a detailed interview with Martinis about this.  It's a very interesting inside look.  
  • I'd never heard of "the Poynting effect" before, and I thought this write-up was very nice.