A blog about condensed matter and nanoscale physics. Why should high energy and astro folks have all the fun?
Friday, August 25, 2017
Hurricanes, heat engines, etc.
Looks like it's going to be a wet few days, with the arrival of Harvey. I've mentioned previously that hurricanes and tropical storm systems are heat engines - they basically use the temperature difference between the heated water in the ocean and the cooler air in the upper atmosphere to drive enormous flows of matter (air currents, water in vapor and liquid form). A great explanation of how this works is here. Even with very crude calculations one can see that the power involved in a relatively small tropical rain event is thousands of GW, hundreds of times greater than the power demands of a major city. Scaling up to a hurricane, you arrive at truly astonishing numbers. It's likely that Harvey is churning along at an average power some 200 times greater than the electrical generating capacity of the planet (!). Conservative predictions right now are for total rainfall of maybe 40 cm across an area the size of the state of Louisiana, which would amount to roughly 5.2e10 metric tons of water. Amazing. I'm planning to write more about some of this in the future, time permitting.
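To make those numbers concrete, here's a back-of-the-envelope version of the estimate (my rough inputs, not precise data: Louisiana's area is about \(1.35 \times 10^{11}\) m\(^2\), the rain falls over roughly three and a half days, and the latent heat released when water vapor condenses is about \(2.26 \times 10^{6}\) J/kg):
\[ M \approx (0.4~\mathrm{m})(1.35 \times 10^{11}~\mathrm{m^{2}})(10^{3}~\mathrm{kg/m^{3}}) \approx 5 \times 10^{13}~\mathrm{kg} \approx 5 \times 10^{10}~\mathrm{metric~tons}, \]
\[ P \sim \frac{M L_{\mathrm{v}}}{\Delta t} \approx \frac{(5 \times 10^{13}~\mathrm{kg})(2.26 \times 10^{6}~\mathrm{J/kg})}{3 \times 10^{5}~\mathrm{s}} \approx 4 \times 10^{14}~\mathrm{W}. \]
That's hundreds of terawatts just from the latent heat released in condensing the rain that falls on that one area - consistent with the factor quoted above.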
Update: For what it's worth, Vox has an article about Harvey, and they say it deposited 14-15 trillion gallons of water. Each gallon of water has a mass of about 3.78 kg, meaning that 14 trillion gallons corresponds to a total deposited mass of 5.3e10 metric tons. How's that for estimating accuracy in the above?
Friday, August 18, 2017
Invited symposia/speaker deadlines, APS 2018
For those readers who are APS members, a reminder. The deadline for nominations for invited symposia for the Division of Condensed Matter Physics for the 2018 March Meeting is August 31. See here.
Likewise, the Division of Materials Physics has their invited speaker (w/in focus topic) deadline of August 29. See here.
Please nominate!
Power output of a lightsaber
It's been a long week and lots of seriously bad things have been in the news. I intend to briefly distract myself by looking at that long-standing question: what is the power output of a lightsaber? I'm talking about the power output when the lightsaber is actually slicing through something, not just looking cool. We can get a very rough, conservative estimate from the documentary video evidence before us. I choose not to use the prequels, on general principle, though the scene in The Phantom Menace when Qui-Gon Jinn cuts through the blast door would be a good place to start. Instead, let's look at The Force Awakens, where Kylo Ren throws a tantrum and slices up an instrument panel, leaving behind dripping molten metal.
With each low-effort swing of his arm, Ren's lightsaber, with a diameter of around 2 cm, moves at something more than 2 m/s, slicing metal to a depth of, say, 3 cm (actually probably deeper than that). That is, the cutting part of the blade is sweeping out a volume of around 1200 cc/sec. It is heating that volume of console material up to well above its melting point, so we need to worry about the energy it takes to heat the solid from room temperature (300 K) up to its melting point, and then the heat of fusion required to melt the material. At a rough guess, suppose imperial construction is aluminum. Aluminum has a specific heat of 0.9 J/g-K, a density of 2.7 g/cc when solid, a melting point of 933 K, and a heat of fusion of 10.7 kJ/mol. In terms of volume, that's (10.7 kJ/mol)(1 mol/27 g)(2.7 g/cc) = 1070 J/cc. So, the total power is around (933 K - 300 K)*(0.9 J/g-K)*(2.7 g/cc)*(1200 cc/sec) + (1070 J/cc)*(1200 cc/sec) = 3.1e6 J/s = 3.1 MW.
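Spelling out the swept-volume step and the final sum in equation form (same rough inputs as above):
\[ \frac{dV}{dt} \approx (2~\mathrm{cm})(3~\mathrm{cm})(200~\mathrm{cm/s}) \approx 1200~\mathrm{cm^{3}/s}, \]
\[ P \approx \left[(633~\mathrm{K})\,(0.9~\mathrm{J/(g\,K)})\,(2.7~\mathrm{g/cm^{3}}) + 1070~\mathrm{J/cm^{3}}\right](1200~\mathrm{cm^{3}/s}) \approx 3.1~\mathrm{MW}. \]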
Hot stuff.
Thursday, August 10, 2017
That's the way the ball bounces.
How does a ball bounce? Why does a ball, dropped from some height onto a flat surface, not bounce all the way back up to its starting height? The answers to these questions may seem obvious, but earlier this week, this paper appeared on the arxiv, and it does a great job of showing what we still don't understand about this everyday physics that is directly relevant for a huge number of sports.
The paper talks specifically about hollow or inflated balls. When a ball is instantaneously at rest, mid-bounce, its shape has been deformed by its interaction with the flat surface. The kinetic energy of its motion has been converted into potential energy, tied up in a combination of the elastic deformation of the skin or shell of the ball and the compression of the gas inside the ball. (One surprising thing I learned from that paper is that high speed photography shows that the non-impacting parts of such inflated balls tend to remain spherical, even as part of the ball deforms flat against the surface.) That gas compression is quick enough that heat transfer between the gas and the ball is probably negligible. A real ball does not bounce back to its full height; equivalently, the ratio \(v_{f}/v_{i}\) of the ball's speed immediately after the bounce, \(v_{f}\), to that immediately before the bounce, \(v_{i}\), is less than one. That ratio is called the coefficient of restitution.
Somehow in the bounce process some energy must've been lost from the macroscopic motion of the ball, and since we know energy is conserved, that energy must eventually show up as disorganized, microscopic energy of jiggling atoms that we colloquially call heat. How can this happen?
- The skin of the ball might not be perfectly elastic - there could be some "viscous losses" or "internal friction" as the skin deforms.
- As the ball impacts the surface, it can launch sound waves into the surface that eventually dissipate.
- Similarly, the skin of the ball itself can start vibrating in a complicated way, eventually damping out to disorganized jiggling of the atoms.
- As the ball's skin hits the ground and deforms, it squeezes air out from beneath the ball; the speed of that air can actually exceed the speed of sound in the surrounding medium (!), creating a shock wave that dissipates by heating the air, as well as ordinary sound vibrations. (It turns out that clapping your hands can also create shock waves! See here and here.)
- There can also be irreversible acoustic processes in the gas inside the ball that heat the gas in there.
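As an aside (a standard kinematic relation, not something from the paper): since drop and rebound heights scale with the squares of the impact and rebound speeds, the coefficient of restitution \(e \equiv v_{f}/v_{i}\) tells you directly how much energy ends up in the channels listed above:
\[ \frac{h_{f}}{h_{i}} = e^{2}, \qquad \frac{\Delta KE}{KE_{i}} = 1 - e^{2}. \]
A ball that rebounds to 60% of its drop height has \(e \approx 0.77\), so roughly 40% of its kinetic energy ends up as disorganized jiggling somewhere every time it hits the floor.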
Saturday, August 05, 2017
Highlights from Telluride
Here are a few highlights from the workshop I mentioned. I'll amend this over the next couple of days as I have time. There is no question that smaller meetings (this one was about 28 people) can be very good for discussions.
- I learned that there is a new edition of Cuevas and Scheer that I should pick up. (The authors are Juan Carlos Cuevas and Elke Scheer, a great theorist/experimentalist team-up.)
- Apparently it's possible to make a guitar amplifier using tunnel junctions made from self-assembled monolayers. For more detail, see here.
- Some folks at Aachen have gotten serious about physics lab experiments you can do with your mobile phone.
- Richard Berndt gave a very nice talk about light emission from atomic-scale junctions made with a scanning tunneling microscope. Some of that work has been written about here and here. A key question is, when a bias of \(eV\) is applied to such a junction, what is the mechanism that leads to the emission of photons of energies \(\hbar \omega > eV\)? Clearly the processes involve multiple electrons, but exactly how things work is quite complicated, involving both the plasmonic/optical resonances of the junction and the scattering of electrons at the atomic-scale region. Two relevant theory papers are here and here.
- Latha Venkataraman showed some intriguing new results indicating room temperature Coulomb blockade-like transport in nanoclusters. (It's not strictly Coulomb blockade, since the dominant energy scale seems to be set by single-particle level spacing rather than by the electrostatic charging energy of changing the electronic population by one electron).
- Katharina Franke showed some very pretty data on single porphyrins measured via scanning tunneling microscope, as in here. Interactions between the tip and the top of the molecule result in mechanical deformation of the molecule, which in turn tunes the electronic coupling between the transition metal in the middle of the porphyrin and the substrate. This ends up being a nice system for tunable studies of Kondo physics.
- Uri Peskin explained some interesting recent results that were just the beginning of some discussions about what kind of photoelectric responses one can see in very small junctions. One recurring challenge: multiple mechanisms that seem to be rather different physics can lead to similar experimentally measurable outcomes (currents, voltages).
- Jascha Repp discussed some really interesting experiments combining STM and THz optics, to do true time-resolved measurements in the STM, such as watching a molecule bounce up and down on a metal surface (!). This result is timely (no pun intended), as this remarkable paper just appeared on the arxiv, looking at on-chip ways of doing THz and faster electronics.
- Jeff Neaton spoke about the ongoing challenge of using techniques like density functional theory to calculate and predict the energy level alignment between molecules and surfaces to which they're adsorbed or bonded. This is important for transport, but also for catalysis and surface chemistry broadly. A relevant recent result is here.
- Jan van Ruitenbeek talked about their latest approach to measuring shot noise spectra in atomically small structures up to a few MHz, and some interesting things that this technique has revealed to them at high bias.
- There were multiple theory talks looking at trying to understand transport, inelastic processes, and dissipation in open, driven quantum systems. Examples include situations where higher driving biases can actually make cooling processes more efficient; whether it's possible to have experiments in condensed matter systems that "see" many-body localization, an effect most explored in cold atom systems; using ballistic effects in graphene to do unusual imaging experiments or make electronic "beam splitters"; open systems from a quantum information point of view; what we mean by local effective temperature on very small scales; and new techniques for transport calculations.
- Pramod Reddy gave a really nice presentation about his group's extremely impressive work measuring thermal conduction at the atomic scale. Directly related, he also talked about the challenges of measuring radiative heat transfer down to nm separations, where the Stefan-Boltzmann approach should be supplanted by near-field physics. This was a very convincing lesson in how difficult it is to ensure that surfaces are truly clean, even in ultrahigh vacuum.
- Joe Subotnik's talk about electronic friction was particularly striking to me, as I'd been previously unaware of some of the critical experiments (1, 2). When and how do electron-hole excitations in metals lead to big changes in the vibrational energy content of molecules, and how should we think about this? These issues are related to these experiments as well.
- Ron Naaman spoke about chiral molecules and how electron transfer to and from these objects can have surprising, big effects (see here and here).
- Gemma Solomon closed out the proceedings with a very interesting talk about whether molecules could be used to make effective insulating layers better at resisting tunneling current than actual vacuum, and a great summary of the whole research area, where it's been, and where it's going.
Thursday, August 03, 2017
Workshop on quantum transport
Blogging has been slow b/c of travel. I'm attending a workshop on "Quantum transport in nanoscale molecular systems". This is rather like a Gordon Conference, with a fair bit of unpublished work being presented, but when it's over I'll hit a few highlights that are already in the literature. Update: here you go.