Thursday, September 10, 2020

The power of a timely collaboration

Sometimes it takes a while to answer a scientific question, and sometimes that answer ends up being a bit unexpected.  Three years ago, I wrote about a paper from our group, where we had found, much to our surprise, that the thermoelectric response of polycrystalline gold wires varied a lot as a function of position within the wire, even though the metal was, by every reasonable definition, a good, electrically homogeneous material.  (We were able to observe this by using a focused laser as a scannable heat source, and measuring the open-circuit photovoltage of the device as a function of the laser position.)  At the time, I wrote "Annealing the wires does change the voltage pattern as well as smoothing it out.  This is a pretty good indicator that the grain boundaries really are important here."

What would be the best way to test the idea that somehow the grain boundaries within the wire were responsible for this effect?  Well, the natural thought experiment would be to do the same measurement in a single crystal gold wire, and then ideally do a measurement in a wire with, say, a single grain boundary in a known location.  

Fig. 4 from this paper
Shortly thereafter, I had the good fortune to be talking with Prof. Jonathan Fan at Stanford.  His group had, in fact, come up with a clever way to create single-crystal gold wires, as shown at right.  Basically they create a wire via lithography, encapsulate it in silicon oxide so that the wire is sitting in its own personal crucible, and then melt/recrystallize the wire.  Moreover, they could build upon that technique as in this paper, and create bicrystals with a single grain boundary.  A focused ion beam could then be used to trim these to the desired width (though in principle that can disturb the surface).

We embarked on a rewarding collaboration that turned out to be a long, complicated path of measuring many, many device structures of various shapes and sizes.  My student Charlotte Evans measured the photothermoelectric (PTE) response of these and worked closely with members of Prof. Fan's group - Rui Yang grew and prepared devices, and Lucia Gan did many hours of electron back-scatter diffraction (EBSD) measurements and analysis for comparison with the photovoltage maps.  My student Mahdiyeh Abbasi learned the intricacies of finite element modeling to see what kind of spatial variation of the Seebeck coefficient \(S\) would be needed to reproduce the photovoltage maps.
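Conceptually, the quantity being mapped is the open-circuit voltage \(V = -\int S(x)\,(dT/dx)\,dx\) as the heat source is scanned.  Here is a minimal 1D toy model of that idea (my own sketch with made-up numbers, not the group's actual finite element model): a Gaussian laser-induced temperature profile scanned along a wire whose Seebeck coefficient has a small step standing in for a structural inhomogeneity.

```python
import math

# Toy 1D model: open-circuit photovoltage V(x0) = -∫ S(x) dT/dx dx for a
# Gaussian laser-induced temperature rise centered at x0.  A small step in
# S(x) at x = 0 stands in for a structural inhomogeneity.  All numbers are
# illustrative, not fits to any real device.
N = 4001
xs = [-10e-6 + 20e-6 * i / (N - 1) for i in range(N)]   # positions (m)
dx = xs[1] - xs[0]

def S(x):
    """Seebeck coefficient (V/K): uniform except for a small step at x = 0."""
    return 1.94e-6 if x < 0 else 1.99e-6

def photovoltage(x0, sigma=0.5e-6, dT=1.0):
    """Voltage with the 'laser' (spot size sigma, peak temperature rise dT) at x0."""
    T = [dT * math.exp(-(x - x0) ** 2 / (2 * sigma ** 2)) for x in xs]
    v = 0.0
    for i in range(1, N - 1):                  # central-difference dT/dx
        v -= S(xs[i]) * (T[i + 1] - T[i - 1]) / (2 * dx) * dx
    return v

# A uniform-S region gives ~zero signal; the S step gives V ≈ ΔS·ΔT.
print(photovoltage(-4e-6))   # laser far from the step: essentially 0
print(photovoltage(0.0))     # laser on the step: ~5e-8 V (0.05 µV/K × 1 K)
```

A map like those in the paper is then just this signal plotted versus laser position; a real treatment needs finite elements to handle 2D geometry, heat sinking into the substrate, and the contacts.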

From Fig. 1 of our new paper.  Panel g upper shows the local crystal misorientation as found from electron back-scatter diffraction, while panel g lower shows a spatial map of the PTE response.  The two patterns definitely resemble each other (panel h), and this is seen consistently across many devices.

The big result of this was published this week in PNAS, and it surprised us: individual high-angle grain boundaries produce a PTE signal so small as to be unresolvable in our measurement system.  In contrast, the PTE measurement could readily detect tiny changes in Seebeck response that correlate with small misorientations within the single-crystal structure.  The wire is still a single crystal, but it contains dislocations and disclinations and stacking faults and good old-fashioned strain due to interactions with the surroundings when it crystallized.  Some of these seem to produce detectable changes in thermoelectric response.  When annealed, the PTE features smooth out and shrink in magnitude, as some (but not all) of the structural defects and strain can anneal away.

So, it turns out the grain boundaries are likely not the cause of the Seebeck variations in these nanostructures - the more probable culprit is residual strain and structural defects from the thin film deposition process, something to watch out for in general in devices made by lithography and thin film processing.  Also, opto-electronic measurements of thermoelectric response are sensitive enough to detect very subtle structural inhomogeneities, an effect that could in principle be leveraged for things like defect detection in manufactured structures.  It took a while to unravel, but it is satisfying to get answers and to see the power of the measurement technique.

Tuesday, September 08, 2020

Materials and popular material

This past week was a great one for my institution, as the Robert A. Welch Foundation and Rice University announced the creation of the Welch Institute for Advanced Materials.  Exactly how this is going to take shape and grow is still in the works, but the stated goals of materials-by-design and making Rice and Houston a global destination for advanced materials research are very exciting.  

Long-time readers of this blog know my view that the amazing physics of materials is routinely overlooked in part because materials are ubiquitous - consider, for example, that the Pauli principle is in some real sense what keeps you from falling through the floor right now.  I'm working on refining a few key concepts/topics that I think are translatable to the general reading public: emergence, symmetry, phases of matter, the most important physical law most people have never heard of (the Pauli principle), quasiparticles, and the quantum world (going full circle from the apparent onset of the classical to using collective systems to return to quantum degrees of freedom in qubits).   Any big topics I'm leaving out?

Saturday, August 29, 2020

Diamond batteries? Unlikely.

The start of the academic year at Rice has been very time-intensive, leading to the low blogging frequency.  I will be trying to remedy that, and once some of the dust settles I may well create a twitter account to point out as-they-happen results and drive traffic this way.  

In the meantime, there has been quite a bit of media attention this week paid to the claim by NDB that they can make nanodiamond-based batteries with some remarkable properties.  This idea was first put forward in this video.  The eye-popping part of the news release is this:  "And it can scale up to electric vehicle sizes and beyond, offering superb power density in a battery pack that is projected to last as long as 90 years in that application – something that could be pulled out of your old car and put into a new one."

The idea is not a new one.  The NDB gadget is a take on a betavoltaic device.  Take a radioactive source that is a beta emitter - in this case, 14C which decays into 14N plus an antineutrino plus an electron with an average energy of 49 keV - and capture the electrons and ideally the energy from the decay.  Betavoltaic devices produce power for a long time, depending on the half-life of the radioactive species (here, 5700 years).  The problem is, the power of these systems is very low, which greatly limits their utility.  For use in applications when you need higher instantaneous power, the NDB approach appears to be to use the betavoltaic gizmo to trickle-charge an integrated supercapacitor that can support high output powers.

To get a sense of the numbers:  if you had perfectly efficient capture of the decay energy from 14 grams of 14C (one mole), my estimate of the total average power available is 13 mW: (6.02e23 × 49000 eV × 1.602e-19 J/eV / 2)/(5700 yr × 365.25 day/yr × 86400 s/day) ≈ 0.013 W. If you wanted to charge the equivalent of a full Tesla battery (80 kW-h), it would take (80000 W-h × 3600 s/h)/(0.013 W) ≈ 2.2e10 seconds. Even if you had 10 kg of pure 14C, that would still take you close to a year.
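The same back-of-the-envelope arithmetic, collected in one place (same inputs and the same crude factor-of-two averaging as above):

```python
# 14C betavoltaic estimate: average power per mole and Tesla-pack charge time.
AVOGADRO = 6.022e23
E_DECAY_J = 49e3 * 1.602e-19           # mean beta energy per decay (J)
T_HALF_S = 5700 * 365.25 * 86400       # 14C half-life (s)

e_per_mole_J = AVOGADRO * E_DECAY_J    # total decay energy in 14 g of 14C
p_per_mole_W = (e_per_mole_J / 2) / T_HALF_S
print(p_per_mole_W)                    # ~0.013 W, i.e. about 13 mW

tesla_pack_J = 80_000 * 3600           # 80 kW-h in joules
moles_in_10kg = 10_000 / 14
days = tesla_pack_J / (p_per_mole_W * moles_in_10kg) / 86400
print(days)                            # ~355 days for 10 kg of pure 14C
```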

Now, the actual image in the press release-based articles shows a chip-based battery labeled "100 nW", which is very reasonable.  This technology is definitely clever, but it just does not have the average power densities needed for an awful lot of applications.

Tuesday, August 18, 2020

Black Si, protected qubits, razor blades, and a question

The run-up to the new academic year has been very time-intensive, so unfortunately blogging has correspondingly been slow.  Here are three interesting papers I came across recently:

  • In this paper (just accepted at Phys Rev Lett), the investigators have used micro/nanostructured silicon to make an ultraviolet photodetector with an external quantum efficiency (ratio of number of charges generated to number of incoming photons) greater than 100%.  The trick is carrier multiplication - a sufficiently energetic electron or hole can in principle excite additional carriers through "impact ionization".  In the nano community, it has been argued that nanostructuring can help this, because nm-scale structural features can help fudge (crystal) momentum conservation restrictions in the impact ionization process. Here, however, the investigators show that nanostructuring is irrelevant for the process, and it has more to do with the Si band structure and how it couples to the incident UV radiation.  
  • In this paper (just published in Science), the authors have been able to implement something quite clever that's been talked about for a while.  It's been known since the early days of discussing quantum computing that one can try to engineer a quantum bit that lives in a "decoherence-free subspace" - basically try to set up a situation where your effective two-level quantum system (made from some building blocks coupled together) is much more isolated from the environment than the building blocks themselves individually.  Here they have done this using a particular kind of defect in silicon carbide "dressed" with applied microwave EM fields.  They can increase the coherence time of the composite system by 10000x compared with the bare defect.
  • This paper in Science uses very cool in situ electron microscopy to show how even comparatively soft hairs can dull the sharp edge of steel razor blades.  See this cool video that does a good job explaining this.  Basically, with the proper angle of attack, the hair can torque the heck out of the metal at the very end of the blade, leading to microfracturing and chipping.
And here is my question:  would it be worth joining twitter and tweeting about papers?  I've held off for a long time, for multiple reasons.  With the enormous thinning of science blogs, I do wonder, though, whether I'd reach more people.

Wednesday, August 05, 2020

The energy of the Beirut explosion

The explosion in Beirut yesterday was truly shocking and awful, and my heart goes out to the residents.  It will be quite some time before a full explanation is forthcoming, but it sure sounds like the source was a shipment of explosives-grade ammonium nitrate that had been impounded from a cargo ship and (improperly?) stored for several years.

Interestingly, it is possible in principle to get a good estimate of the total energy yield of the explosion from cell phone video of the event.  The key is a fantastic example of dimensional analysis, a technique somehow more common in an engineering education than in a physics one.  The fact that all of our physical quantities have to be defined by an internally consistent system of units is actually a powerful constraint that we can use in solving problems.  For those interested in the details of this approach, you should start by reading about the Buckingham Pi Theorem.  It seems abstract and its applications seem a bit like art, but it is enormously powerful.  

The case at hand was analyzed by the British physicist G. I. Taylor, who was able to estimate the yield of the Trinity atomic bomb test from still photographs published in a magazine.  Assume that a large amount of energy \(E\) is deposited instantly in a tiny volume at time \(t=0\), and this produces a shock wave that expands spherically with some radius \(R(t)\) into the surrounding air of mass density \(\rho\).  If you assume that this contains all the essential physics in the problem, then \(R\) must in general depend on \(t\), \(\rho\), and \(E\).  Now, \(R\) has units of length (meters).  The only way to combine \(t\), \(\rho\), and \(E\) into something with the units of length is \( (E t^2/\rho)^{1/5}\).  That implies that \( R = k (E t^2/\rho)^{1/5} \), where \(k\) is some dimensionless number, probably on the order of 1.  If you cared about precision, you could go and do an experiment:  detonate a known amount of dynamite on a tower and film the whole thing with a high-speed camera, and you could experimentally determine \(k\).  I believe the constant is found to be close to 1.
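The unit bookkeeping behind that claim is quick to verify.  With \(E\) in joules (\(\mathrm{kg\,m^2\,s^{-2}}\)) and \(\rho\) in \(\mathrm{kg\,m^{-3}}\):

\[
\left[\frac{E t^{2}}{\rho}\right] = \frac{\left(\mathrm{kg\,m^{2}\,s^{-2}}\right)\left(\mathrm{s^{2}}\right)}{\mathrm{kg\,m^{-3}}} = \mathrm{m^{5}}, \qquad \text{so} \qquad \left[\left(E t^{2}/\rho\right)^{1/5}\right] = \mathrm{m}.
\]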

Flipping things around and solving (taking \(k \approx 1\)), we find \(E = R^5 \rho/t^2\).  (A more detailed version of this derivation is here.)  

This YouTube video is the best one I could find in terms of showing a long-distance view of the explosion with some kind of background scenery for estimating the scale.  Based on the "before" view and the skyline in the background, and a Google Maps satellite image of the area, I very crudely estimated the radius of the shockwave at about 300 m at \(t = 1\) second.  Using 1.2 kg/m\(^{3}\) for the density of air, that gives an estimated yield of about 3 trillion joules, or the equivalent of around 0.7 kT of TNT.   That's actually pretty consistent with the idea that there were 2750 tons of ammonium nitrate to start with, though the agreement is probably fortuitous - that radius to the fifth power really can push the numbers around.
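Plugging in the same crude inputs (a 300 m shock radius at \(t = 1\) s, with \(k\) taken to be 1):

```python
# Taylor-style yield estimate from the shock radius: E = (R/k)^5 * rho / t^2.
R = 300.0      # shock radius (m), crudely read off the video at t = 1 s
t = 1.0        # time after detonation (s)
rho = 1.2      # density of air (kg/m^3)
k = 1.0        # dimensionless constant, taken to be ~1

E = (R / k) ** 5 * rho / t ** 2
print(E)                 # ~2.9e12 J
print(E / 4.184e12)      # ~0.7 kiloton TNT equivalent (1 kT = 4.184e12 J)
```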

Dimensional analysis and scaling are very powerful - it's why people are able to do studies in wind tunnels or flow tanks and properly predict what will happen to full-sized aircraft or ships, even without fully understanding the details of all sorts of turbulent fluid flow.  Physicists should learn this stuff (and that's why I stuck it in my textbook.)

Saturday, August 01, 2020

How long does quantum tunneling take?

The "tunneling time" problem has a long, fun history.  Here is a post that I wrote about this issue 13 years ago (!!).  In brief, in quantum mechanics a particle can "tunnel" through a "classically forbidden" region (a region where by simple classical mechanics arguments, the particle does not have sufficient kinetic energy to be there).  I've written about that more recently here, and the wikipedia page is pretty well done.  The question is, how long does a tunneling particle spend in the classically forbidden barrier?  

It turns out that this is not a trivial issue at all.  While that's a perfectly sensible question to ask from the point of view of classical physics, it's not easy to translate that question into the language of quantum mechanics.  In lay terms, a spatial measurement tells you where a particle is, but doesn't say anything about where it was, and without such a measurement there is uncertainty in the initial position and momentum of the particle.  

Some very clever people have thought about how to get at this issue.  This review article by Landauer and Martin caught my attention when I was in grad school, and it explains the issues very clearly.  One idea people had (Baz' and Rybachenko) is to use the particle itself as a clock.  If the tunneling particle has spin, you can prepare the incident particles to have that spin oriented in a particular direction.  Then apply a magnetic field confined to the tunneling barrier.  Look at the particles that did tunnel through and see how far the spins have precessed.  This idea is shown below.
"Larmor clock", from this paper

This is a cute idea in theory, but extremely challenging to implement in an experiment.  However, this has now been done by Ramos et al. from the Steinberg group at the University of Toronto, as explained in this very nice Nature paper.  They are able to do this and actually see an effect that Landauer and others had discussed:  there is "back-action", where the presence of the magnetic field itself (essential for the clock) has an effect on the tunneling time.  Tunneling is not instantaneous, though it is faster than the simple "semiclassical" estimate (that one would get by taking the magnitude of the imaginary momentum in the barrier and using that to get an effective velocity).  Very cool.
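For a sense of the scales in that "semiclassical" estimate, here is the textbook version for a rectangular barrier: the magnitude of the imaginary momentum is \(\hbar\kappa = \sqrt{2m(V_0 - E)}\), and the traversal time is the barrier width divided by the corresponding velocity.  The electron-scale numbers below are my own illustrative choices, not the parameters of the Toronto experiment (which used ultracold rubidium atoms, where the times are vastly longer).

```python
import math

# Semiclassical barrier-traversal time for a rectangular barrier:
# kappa = sqrt(2 m (V0 - E)) / hbar, v_eff = hbar * kappa / m, tau = L / v_eff.
# Electron and barrier parameters are illustrative, not from the experiment.
hbar = 1.0546e-34       # J s
m_e = 9.109e-31         # electron mass (kg)
eV = 1.602e-19          # J

V0_minus_E = 1.0 * eV   # barrier height above the particle energy
L = 1.0e-9              # barrier width (m)

kappa = math.sqrt(2 * m_e * V0_minus_E) / hbar   # inverse decay length (1/m)
tau = L / (hbar * kappa / m_e)                   # traversal time (s)
print(tau)              # ~1.7e-15 s for a 1 eV, 1 nm barrier
```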

Saturday, July 25, 2020

Kitchen science: insulated cups

An impromptu science experiment this morning.  A few months ago we acquired some very nice insulated tumblers (initially from causebox and then more from here).  Like all such insulated items, they have inner and outer walls made from a comparatively lousy thermal conductor, in this case stainless steel.  (Steel is an alloy, and the disorder in its micro- and nanoscale structure scatters electrons, giving it a lower electrical (and hence thermal) conductivity than pure metals.)  Ideally the walls touch only at the very top lip of the cup, where they are joined, and the space between the walls has been evacuated to minimize heat conduction by any trapped gas.  When working well, so that heat transfer has to take place along the thin metal wall, the interior wall of the cup tends to sit very close to the temperature of whatever liquid is inside, and the exterior wall tends to sit at room temperature.
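You can see why the design works with a rough conduction estimate for the remaining heat path, down the thin steel wall from the lip to the liquid, using \(Q = k A \Delta T / L\).  All the dimensions below are guesses for a generic tumbler, not measurements of ours:

```python
# Heat leak down the thin stainless wall, Q = k * A * dT / L, where the
# conducting cross-section A is (circumference) x (wall thickness).
# Dimensions are rough guesses for a generic insulated tumbler.
k_steel = 15.0          # W/(m K), typical for stainless steel
circumference = 0.25    # m (roughly an 8 cm diameter cup)
thickness = 0.5e-3      # m, wall thickness
path_length = 0.15      # m, lip down to the liquid level along the wall
dT = 20.0               # K, room temperature minus ice water

A = circumference * thickness
Q = k_steel * A * dT / path_length
print(Q)                # ~0.25 W: a tiny leak, so the ice lasts for hours
```

A dent that brings the inner and outer walls into contact adds a much larger parallel heat path, which is exactly the kind of failure a quick thermometer comparison can reveal.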

We accidentally dropped one of the cups this morning, making a dent near the base.  The question was, did this affect the thermal insulation of that cup?  To test this, we put four ice cubes and four ounces of water from our refrigerator into each cup and let them sit on the counter for 15 minutes.  Then we used an optical kitchen thermometer (with handy diode laser for pointing accuracy) to look at the exterior and interior wall temperatures.  (Apologies for the use of Fahrenheit units.)  Check this out.

The tumbler on the left is clearly doing a better job of keeping the outside warm and the inside cold.  If we then scrutinize the tumbler on the right we find the dent, which must be deep enough to bring the inner and outer walls barely into contact.

The bottom line:  Behold, science works.  Good insulated cups are pretty impressive engineering, but you really should be careful with them, because the layers really are close together and can be damaged.