Thursday, May 28, 2015

Fun with fluids: Hydraulic jump

We usually think of shock waves as exotic - something that happens when a huge explosion goes off, or when a supersonic plane flies by.  A shock in a gas is an abrupt boundary between relatively cold gas moving faster than the local speed of sound (that is, with a Mach number \(M \equiv v/c_{s} > 1\), where \(c_{s}\) is the sound speed and \(v\) is the gas speed, both measured in the frame of the shock) and warmer gas moving slower than sound (\(M < 1\)).  A shock that moves on its own relative to a stationary environment is a shock wave, while one that remains fixed in place relative to its surroundings is a "standing shock".   The details of the gas motion within the shock itself are very complicated, but the constraints of mass and momentum conservation make it possible to understand a lot about the relationship between upstream and downstream gas conditions even without knowing the nitty-gritty.
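For the quantitatively inclined, here is what that conservation-law bookkeeping looks like in practice - a minimal sketch of the standard normal-shock ("Rankine-Hugoniot") relations for an ideal gas, in the rest frame of the shock (the function name and the choice of \(\gamma = 1.4\) for air are mine):

```python
# Normal-shock (Rankine-Hugoniot) jump conditions for an ideal gas.
# Given the upstream Mach number M1 > 1, compute the downstream Mach
# number and the density/pressure/temperature ratios across the shock.

def normal_shock(M1, gamma=1.4):
    """Return (M2, rho2/rho1, p2/p1, T2/T1) for upstream Mach M1 > 1."""
    if M1 <= 1.0:
        raise ValueError("A shock requires supersonic upstream flow (M1 > 1).")
    M2 = ((1 + 0.5 * (gamma - 1) * M1**2) /
          (gamma * M1**2 - 0.5 * (gamma - 1))) ** 0.5
    rho_ratio = (gamma + 1) * M1**2 / ((gamma - 1) * M1**2 + 2)
    p_ratio = 1 + 2 * gamma / (gamma + 1) * (M1**2 - 1)
    T_ratio = p_ratio / rho_ratio    # ideal gas: T2/T1 = (p2/p1)/(rho2/rho1)
    return M2, rho_ratio, p_ratio, T_ratio

M2, r, p, T = normal_shock(2.0)
print(f"M2 = {M2:.3f}, rho2/rho1 = {r:.2f}, p2/p1 = {p:.2f}, T2/T1 = {T:.2f}")
# -> M2 = 0.577: cool supersonic gas upstream, warmer subsonic gas downstream.
```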
  
It turns out that you have very likely seen a fluid analog of a standing shock in your sink!   Run the tap so that a stream of water hits the flat bottom of a typical kitchen sink.  You will see a disk-shaped region with a radius of a few cm (depending on flow rate) where the water is fast-moving but thin, surrounded by a turbulent ring, outside of which the water layer is thicker but slower-moving.   This is called a hydraulic jump.   The fast-moving water has a speed \(v\) greater than the speed of gravity-driven ripples in a thin fluid layer, \(\sqrt{g h}\), where \(g\) is the gravitational acceleration and \(h\) is the fluid depth.  The Froude number \(Fr \equiv v/\sqrt{gh}\) is therefore greater than one on the fast-moving side of the jump, and less than one on the slow-moving side.  As in the gas shock case, the boundary itself is a mess, but mass and momentum conservation let you calculate the flow speed and fluid depth downstream if you know them upstream.
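And here is the sink version of the same bookkeeping - a minimal sketch using the standard Bélanger relation for the depth ratio across the jump (the function name is mine):

```python
# The sink-bottom analog: mass and momentum conservation across a
# hydraulic jump give the Belanger equation for the depth ratio.
from math import sqrt

def hydraulic_jump(Fr1):
    """Given the upstream Froude number Fr1 > 1, return (h2/h1, Fr2)."""
    if Fr1 <= 1.0:
        raise ValueError("A jump requires supercritical upstream flow (Fr1 > 1).")
    depth_ratio = 0.5 * (sqrt(1 + 8 * Fr1**2) - 1)   # Belanger equation
    Fr2 = Fr1 / depth_ratio**1.5                     # continuity: v2 = v1 h1/h2
    return depth_ratio, Fr2

ratio, Fr2 = hydraulic_jump(3.0)
print(f"h2/h1 = {ratio:.2f}, Fr2 = {Fr2:.2f}")
# -> h2/h1 = 3.77, Fr2 = 0.41: the downstream layer is deeper, slower,
#    and subcritical, just like the subsonic gas downstream of a shock.
```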

The receding floodwaters in my neighborhood Tuesday provided me with a great example of a hydraulic jump, shown in the brief video clip above.  I still think it's cool that you can see an analog of a sonic boom in your sink, or in a nearby street if you're unlucky.

Tuesday, May 26, 2015

Storms are powerful heat engines!

Storms are incredibly powerful (on human scales) heat engines - driven by the sun, the fluxes of mass and energy are simply enormous.  To see just how big, let's take a look at the big thunderstorm system that flooded large parts of the city of Houston last night (hence why I'm blogging and unable to get to campus until the waters recede).   A large storm system dumped about 20 cm of rain (!) over a land area of approximately 4000 km\(^{2}\) between 10:00pm and about 4:00am.   That's \(8 \times 10^{11}\) kg of water!

The bottom of the rain clouds was maybe 0.5 km above ground level.  That's a lower bound on how far all that rain had to fall.  Using \(g \approx 10~\mathrm{m/s^{2}}\), that's about \(4 \times 10^{15}\) J of energy, deposited in about 20,000 s, for an average power delivered of \(2 \times 10^{11}\) W, as much as 100 municipal-scale power stations.   That doesn't even account for the energy contained in the wind and the lightning discharges.

Remember, this is all being driven by the sun, through temperature differences that are at most 20 K.  Thermodynamics tells us that the most efficient this process could possibly be is something like \(1 - (300~\mathrm{K}/320~\mathrm{K}) = 1/16\).  That means that the total energy involved had to be at least \(6.4 \times 10^{16}\) J = about 18 billion kW-h, and that's only one part of a big storm system.  This is why engineering the weather is a non-starter!
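If you want to check the arithmetic, the whole back-of-the-envelope fits in a few lines of Python (every input is just the rough estimate quoted above, not a measured value):

```python
# Back-of-envelope numbers for the storm, as in the text.
rain_depth = 0.20        # m (20 cm of rain)
area = 4000e6            # m^2 (4000 km^2)
rho_water = 1000.0       # kg/m^3
g = 10.0                 # m/s^2, rounded
fall_height = 500.0      # m, lower bound on cloud-base height
duration = 2e4           # s (~6 hours)

mass = rain_depth * area * rho_water     # ~8e11 kg of water
energy = mass * g * fall_height          # ~4e15 J of potential energy
power = energy / duration                # ~2e11 W average

carnot = 1 - 300.0 / 320.0               # best-case efficiency, ~1/16
total = energy / carnot                  # ~6.4e16 J driving the storm
print(f"mass = {mass:.1e} kg, power = {power:.1e} W, "
      f"total >= {total:.1e} J = {total / 3.6e6:.1e} kW-h")
```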

Monday, May 25, 2015

What is band theory? (car analogy)

One of the commenters on my previous post asked how I could explain band theory to a nonscientist, an artist in particular.  Here's a shot.  By necessity, when trying to give an explanation that avoids math almost completely, I'm forced to lean heavily on analogy, which means sacrificing accuracy to some degree.  Still, I think this is a useful exercise - it certainly makes me think hard about what I consider to be the important elements of a concept.  (This must be what Randall Munroe had to do many times for his upcoming book!  If you haven't read his first one, what have you been waiting for?)

The electronic properties of many crystalline materials are well described by "band theory".  At its heart, band theory comes down to three important ideas that I'll explain more in a minute:  Electrons in solids can only occupy certain states (to be defined below); those states are determined by the arrangement of the atoms in the solid; and each state can hold only two electrons, no more.   To describe this, I'm going to have to mix metaphors a bit, but bear with me.

In a very American mode, we're going to picture the electronic states as individual lanes in a verrrrrry wide, multi-lane highway.  Each lane has a different speed limit (each state has a particular kinetic energy), with the slowest traffic off to the driver's right (in this US-centric analogy) and speed limits increasing progressively to the driver's left.   Each lane can only hold at most two cars (each state can only hold two electrons, one of each kind of "spin").   Here's where the analogy becomes more of a reach:  Not all speed limits (electron kinetic energies) are allowed.  Speeds in adjacent lanes are separated by a small amount (energy level spacings are set by the size of the crystal), and some lanes are missing altogether (some energies are outright forbidden, determined by the type and arrangement of atoms in the crystal).  So, there are "bands" of lanes, separated from each other by "gaps".

Now we start adding cars to the highway with the restriction that cars can only drive at the speed limit of their lane, and (in an un-American twist) the drivers want to go the minimum possible speed.  This is going to tell us the "ground state", the slowest/lowest energy configuration of the system.  The first two cars (electrons) go into the slowest lanes waaaaay over on the driver's right.  The next two cars go into the second-slowest lane, and so forth.  We keep adding in cars (electrons) until we run out of inventory (until we have kept track of all of the electrons).  The more cars we put in, the faster the top speed of the fastest cars!

Cars can only merge into lanes that are open (or only partly occupied).  If the last car added ends up in a lane in the middle of a band of lanes, so that it can easily merge into an adjacent unoccupied lane, this situation corresponds to a material that is a metal.  If the last car ends up right against the guard rail of a band of lanes, so that there just is no adjacent lane to the driver's left available, then this situation corresponds to a "band insulator".   (If the gap to the next band of lanes is large, we call such materials "insulators"; if it's not too big, we call those materials "semiconductors".)
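For readers who don't mind a little code with their analogy, here is a toy sketch of the lane-filling procedure just described; the bands, level spacings, gap threshold, and electron counts are all invented purely for illustration:

```python
# A toy version of the lane-filling picture: put electrons into energy
# levels two at a time, lowest first, and ask where the last one lands.

def classify(bands, n_electrons, gap_threshold=2.0):
    """bands: list of bands, each a list of allowed level energies (eV).
    Returns 'metal', 'semiconductor', or 'insulator'."""
    if n_electrons % 2 == 1:
        return "metal"                 # a half-filled top level can always "merge"
    levels = sorted((E, i) for i, band in enumerate(bands) for E in band)
    top_E, top_band = levels[n_electrons // 2 - 1]   # last filled level
    if any(E > top_E for E in bands[top_band]):
        return "metal"                 # empty lanes left in the same band
    # Band exactly full: how big is the gap up to the next band of lanes?
    gap = min(E for band in bands for E in band if E > top_E) - top_E
    return "semiconductor" if gap < gap_threshold else "insulator"

band1 = [0.1 * k for k in range(10)]          # levels at 0.0 ... 0.9 eV
band2 = [3.0 + 0.1 * k for k in range(10)]    # levels at 3.0 ... 3.9 eV
print(classify([band1, band2], 10))           # band half full     -> metal
print(classify([band1, band2], 20))           # band1 exactly full -> insulator
```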

One point that even this very imperfect analogy can highlight:  The speed of the fastest cars (electrons) in a block of copper is actually about 0.5% of the speed of light (!), or nearly 6,000,000 kph.  For metals with even more electrons, the fastest movers can be going so quickly that relativistic effects become important!
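That number isn't hard to check with the standard free-electron-gas estimate (a minimal sketch; the electron density I plug in for copper, one conduction electron per atom, is the usual textbook value):

```python
# Checking the "~0.5% of the speed of light" claim with the standard
# free-electron-gas estimate for copper.
from math import pi

hbar = 1.055e-34     # J s
m_e = 9.11e-31       # kg
c = 3.0e8            # m/s
n = 8.5e28           # conduction electrons per m^3 (one per Cu atom)

k_F = (3 * pi**2 * n) ** (1 / 3)   # Fermi wavevector, ~1.4e10 1/m
v_F = hbar * k_F / m_e             # speed of the fastest electrons
print(f"v_F = {v_F:.2e} m/s = {v_F / c:.2%} of c = {v_F * 3.6:.2e} km/h")
# -> v_F ~ 1.6e6 m/s, about 0.5% of c, i.e. nearly 6 million km/h
```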

This was a very rough cut.  I'll try to return to this later, with other ways of thinking about it.


Monday, May 18, 2015

Book recommendations: Stuff Matters and The Disappearing Spoon

I've lamented the lack of good popularizations of condensed matter/solid state physics.  I do, however, have recommendations for two relatively recent books about materials and chemistry, which is pretty close.

The Disappearing Spoon, by Sam Kean, is a fun, engaging stroll across the periodic table, exploring the properties of the various chemical elements through the usually fascinating, sometimes funny, occasionally macabre histories of their discoveries and uses.  The title references joke spoons made from gallium that would melt (and fall to the bottom of the cup) when used to stir tea.  The tone is light and anecdotal, and the history is obscure enough that you haven't heard all the stories before.  Very fun.

Stuff Matters, by Mark Miodownik, is similar in spirit, though not quite so historical and containing more physics and materials science.  The author is a materials scientist who happens to be a gifted writer and popularizer as well.  He's done a three-episode BBC series about materials (available here), another BBC series about modern technologies, and a TED lesson about why glass is transparent.

Wednesday, May 13, 2015

A matter of gravity

Gravity remains an enduring challenge in physics.  Newton had the insight that he could understand many phenomena (e.g., the falling of an apple, the orbit of Halley's comet) if the gravitational interaction between two objects is an attractive force proportional to the product of the objects' masses, and inversely proportional to the square of the distance between them ( \(F = - G M_{1}M_{2}/r^{2}\) ), and acts along the line between the objects.  The constant of proportionality, \(G\), is Newton's gravitational constant.   About 225 years later, Einstein had the insight that in more generality one should think of gravity as actually distorting space-time; what looks like a force is really a case of freely falling objects moving in the (locally) straightest trajectories that they can.  (Obligatory rubber sheet analogy here.)  In that theory, general relativity (GR), Newton's constant \(G\) again appears as a constant of proportionality that basically sets the scale for the amount of space-time distortion produced by a certain amount of stress-energy (rather than just good old-fashioned mass).  GR has been very successful so far, though we have reasons to believe that it is the classical limit of some still unknown quantum theory of gravity.  Whatever that quantum theory is, \(G\) must still show up to set the scale for the gravitational interaction.

It makes sense that we would like to know the numerical value of \(G\) as accurately and precisely as possible - seems like the first thing you'd like to understand, right?  The challenge is, as I've explained before, gravity is actually an incredibly weak force.  To measure it well in absolute numbers, you need an apparatus that can measure small forces while not being influenced by other, faaaaaar stronger forces like electromagnetism, and you need to do something like measure the force (or the counter-force that you need to apply to null out the gravitational force) as a function of different configurations of test masses (such as tungsten spheres). 
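To get a feel for just how small these forces are, here's a one-line estimate (the masses and separation are arbitrary choices of mine):

```python
# How weak is gravity? The force between two 1 kg spheres whose
# centers sit 10 cm apart.
G = 6.674e-11                  # N m^2/kg^2

m1 = m2 = 1.0                  # kg, e.g. small tungsten spheres
r = 0.10                       # m, center-to-center separation

F_grav = G * m1 * m2 / r**2
print(f"F_grav = {F_grav:.1e} N")   # ~6.7e-9 N
# Each sphere *weighs* ~10 N - over a billion times more - so every stray
# influence (air currents, electrostatics, vibration) must be exquisitely
# controlled to measure the gravitational pull at all.
```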

I'm revisiting this because of a couple (1, 2) of interesting papers that came out recently.  As I'd said in that 2010 post, measuring \(G\) is so difficult that different groups have obtained nominally high precision measurements (precise out to the fourth decimal place, such as \(G = (6.6730 \pm 0.00029) \times 10^{-11}~\mathrm{N\,m^{2}/kg^{2}}\)) that are mutually inconsistent with each other.  See this plot (Fig. 1 from arxiv:1505.01774).  The various symbols correspond to different published measurements of \(G\) over the last 35 years (!).  The distressing thing is that there does not seem to be much sign of convergence.  The recent papers are looking to see whether there is actually some periodicity to the results (as hinted by the sinusoid on the plot).  To be clear:  The authors are not suggesting that \(G\) really varies with a several-year period - rather, they're exploring the possibility that there might be some unknown systematic effect that is skewing the results of some or all of the various measurement approaches.  As both teams of authors say, the best solution would be to come up with a very clean experimental scheme and run it, undisturbed, continuously for years at a time.  That's not easy or cheap.  It's important to note that this is what real, careful measurement science looks like, not some of the stuff that has made web headlines lately.

Wednesday, May 06, 2015

People you should've heard about: John Bardeen

If you ask the average person to name a physicist, chances are they'll mention Einstein, Hawking, and possibly Sheldon Cooper.  Maybe Richard Feynman, Brian Greene, or (*sigh*) Michio Kaku.  I'd like to have an occasional series of posts pointing out people who should be well-known, but for some reason are not.  High up on that list:  John Bardeen, the only person to win two Nobel prizes in physics.

Bardeen, like many of his contemporaries, followed what would now be considered a meandering, unconventional trajectory into physics, starting out as an undergrad engineer at Wisconsin, working as a geophysicist, enrolling as a math grad student at Princeton, and eventually doing a doctoral thesis with Wigner worrying about electron-electron interactions in metals (resulting in these two papers about how much energy it takes to remove an electron from a metal, and how that can be strongly affected by the very last layer of atoms at the surface - in the 1980s this would be called "surface science" and now it would be called "nanoscience").

Bardeen was a quiet, brilliant person.  After WWII (during which he worked for the Navy), he went to Bell Labs, where he worked with Walter Brattain (and, much more disagreeably, with William Shockley) to invent the point-contact transistor, explaining the critical importance of "surface states" (special levels for the electrons in a semiconductor that exist at the surface, where the periodic potential of the lattice is terminated).  Shockley is viewed in hindsight as famously unpleasant as a co-worker/boss - Bardeen left Bell Labs in large part because of this and ended up at Illinois, where seven years later he worked with Bob Schrieffer and Leon Cooper to produce the brilliant BCS theory of superconductivity, earning his second Nobel.  (Shockley's borderline abusive management style is also responsible for the creation of modern Silicon Valley, but that's another story.)

During and after this period, Bardeen helped build the physics department of UIUC into a condensed matter physics powerhouse, a position it continues to hold.  He was very interested in the theory of charge density waves (special states where the electrons in a solid spontaneously take on a spatially periodic density), though according to Lillian Hoddeson's excellent book (see here, too) he had lost the intellectual flexibility of his youth by this time.  

Bardeen contributed greatly to our understanding and advancement of two whole classes of technologies that have reshaped the world (transistors and superconductors).  He was not a flamboyant personality like Feynman (after all, he was from the Midwest :-) ), and he was not a self-promoter (like Feynman), but he absolutely deserves greater recognition and appreciation from the general public.