Saturday, June 22, 2024

What is turbulence? (And why are helicopters never quiet?)

Fluid mechanics is very often left out of the undergraduate physics curriculum.  This is a shame, as it's very interesting and directly relevant to many broad topics (atmospheric science, climate, plasma physics, parts of astrophysics).  Fluid mechanics is a great example of how it is possible to have comparatively simple underlying equations and absurdly complex solutions, and that's probably part of the issue.  The space of solutions can be mapped out using dimensionless ratios, and two of the most important are the Mach number (\(\mathrm{Ma} \equiv u/c_{s}\), where \(u\) is the speed of some flow or object, and \(c_{s}\) is the speed of sound) and the Reynolds number (\(\mathrm{Re} \equiv \rho u d/\mu\), where \(\rho\) is the fluid's mass density, \(d\) is some length scale, and \(\mu\) is the viscosity of the fluid). 

From Laurence Kedward, wikimedia commons

There is a nice physical interpretation of the Reynolds number.  It can be rewritten as \(\mathrm{Re} = (\rho u^{2})/(\mu u/d)\).  The numerator is the "dynamic pressure" of a fluid, the force per unit area that would be transferred to some object if a fluid of density \(\rho\) moving at speed \(u\) ran into the object and was brought to a halt.  This is in a sense the consequence of the inertia of the moving fluid, so this is sometimes called an inertial force.  The denominator, the viscosity multiplied by a velocity gradient, is the viscous shear stress (force per unit area) caused by the frictional drag of the fluid.  So, the Reynolds number is a ratio of inertial forces to viscous forces.  
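To make that equivalence concrete, here is a minimal Python sketch (my own illustrative numbers for water, not from the post) that evaluates the Reynolds number both from the standard definition and as the ratio of dynamic pressure to viscous shear stress:

```python
# Illustrative example: Re computed two equivalent ways.
# Fluid properties below are approximate values for room-temperature water.
rho = 1000.0   # mass density, kg/m^3
mu = 1.0e-3    # dynamic viscosity, Pa*s
u = 0.5        # flow speed, m/s (chosen for illustration)
d = 0.02       # characteristic length scale, m (e.g., a 2 cm pipe)

# Standard definition: Re = rho * u * d / mu
re_standard = rho * u * d / mu

# Equivalent form: (dynamic pressure) / (viscous shear stress)
dynamic_pressure = rho * u**2   # inertial stress scale, Pa
viscous_stress = mu * u / d     # viscous stress scale, Pa
re_ratio = dynamic_pressure / viscous_stress

print(re_standard, re_ratio)  # the two forms agree, ~1e4 here
```

With these numbers the flow is well into the high-Re regime; the point is just that the two expressions are algebraically identical.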

When \(\mathrm{Re}\ll 1\), viscous forces dominate.  That means that viscous friction between adjacent layers of fluid tends to smooth out velocity gradients, and the velocity field \(\mathbf{u}(\mathbf{r},t) \) tends to be simple and often analytically solvable.  This regime is called laminar flow.  Since \(d\) is just some characteristic size scale, for reasonable values of the density and viscosity of, say, water, the tiny channels of microfluidic devices tend to put them squarely in the laminar regime.  
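A rough estimate (my own illustrative numbers, not from the post) shows why microfluidics lives in this regime: water creeping through a 100-micron channel at a millimeter per second gives a Reynolds number well below 1.

```python
# Illustrative microfluidic estimate: water in a 100-micron channel.
rho = 1000.0   # kg/m^3, water
mu = 1.0e-3    # Pa*s, water
u = 1.0e-3     # m/s, a typical slow microfluidic flow speed (assumption)
d = 1.0e-4     # m, channel dimension of 100 microns (assumption)

re = rho * u * d / mu
print(re)  # ~0.1, i.e., Re << 1: viscosity wins, and the flow is laminar
```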

When \(\mathrm{Re}\gg 1\), frictional effects are comparatively unimportant, and the fluid "pushes" its way along.  The result is a situation where the velocity field is unstable to small perturbations, and there is a transition to turbulent flow.  The local velocity field has big, chaotic variations as a function of space and time.  While the microscopic details of \(\mathbf{u}(\mathbf{r},t)\) are often not predictable, on a statistical level we can get pretty far since mass conservation and momentum conservation can be applied to a region of space (the control volume or Eulerian approach).

Turbulent flow involves a cascade of energy from large eddies down through successively smaller ones, eventually reaching length scales approaching the mean free path of the fluid molecules.   This right here is why helicopters are never quiet.  Even if you started with a completely uniform downward flow of air below the rotor (enough of a momentum flux to support the weight of the helicopter), the air would quickly transition to turbulence, and there would be pressure fluctuations over a huge range of timescales that would translate into acoustic noise.  You might not be able to hear the turbine engine directly from a thousand feet away, but you can hear the resulting sound from the turbulent airflow.  
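An order-of-magnitude estimate (my own illustrative numbers, not from the post) makes the inevitability of that transition clear: air moving at tens of m/s on rotor-sized length scales has an enormous Reynolds number.

```python
# Illustrative estimate: downwash below a helicopter rotor.
rho = 1.2      # kg/m^3, air near sea level
mu = 1.8e-5    # Pa*s, dynamic viscosity of air
u = 20.0       # m/s, a plausible downwash speed (assumption)
d = 10.0       # m, length scale set by the rotor diameter (assumption)

re = rho * u * d / mu
print(f"Re ~ {re:.1e}")  # on the order of 1e7: Re >> 1, deeply turbulent
```

With \(\mathrm{Re}\) of order \(10^{7}\), no realistic smoothing keeps such a flow laminar.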

If you're interested in fluid mechanics, this site is fantastic, and their links page has some great stuff.

Friday, June 14, 2024

Artificial intelligence, extrapolation, and physical constraints

Disclaimer and disclosure:  The "arrogant physicist declaims about some topic far outside their domain expertise (like climate change or epidemiology or economics or geopolitics or....) like everyone actually in the field is clueless" trope is very overplayed at this point, and I've generally tried to avoid doing this.  Still, I read something related to AI earlier this week, and I wanted to write about it.  So, fair warning: I am not an expert about AI, machine learning, or computer science, but I wanted to pass this along and share some thoughts.  Feel even more free than usual to skip this and/or dismiss my views.

This is the series of essays, and here is a link to the whole thing in one pdf file.  The author works for OpenAI.  I learned about this from Scott Aaronson's blog (this post), which is always informative.

In a nutshell, the author says that he is one of a quite small group of people who really know the status of AI development; that we are within a couple of years of the development of artificial general intelligence; that this will lead essentially to an AI singularity as AGI writes ever-smarter versions of AGI; that the world at large is sleepwalking toward this and its inherent risks; and that it's essential that western democracies have the lead here, because it would be an unmitigated disaster if authoritarians in general and the Chinese government in particular should take the lead - if one believes in extrapolating exponential progressions, then losing the initiative rapidly translates into being hopelessly behind forever.

I am greatly skeptical of many aspects of this (in part because of the dangers of extrapolating exponentials), but it is certainly thought-provoking.  

I doubt that we are two years away from AGI.  Indeed, I wonder if our current approaches are somewhat analogous to Ptolemaic epicycles.  It is possible in principle to construct extraordinarily complex epicyclic systems that can reproduce predictions of the motions of the planets to high precision, but actual Newtonian orbital mechanics is radically more compact, efficient, and conceptually unified.  Current implementations of AI systems use enormous numbers of circuit elements that consume tens to hundreds of MW of electricity.  In contrast, your brain hosts a human-level intelligence, consumes about 20 W, and masses about 1.4 kg.  I just wonder if our current architectural approach is not the optimal one toward AGI.  (Of course, a lot of people are researching neuromorphic computing, so maybe that resolves itself.)

The author also seems to assume that whatever physical resources are needed for rapid exponential progress in AI will become available.  Huge numbers of GPUs will be made.  Electrical generating capacity and all associated resources will be there.  That's not obvious to me at all.  You can't just declare that vastly more generating capacity will be available in three years - siting and constructing GW-scale power plants takes years on its own.  TSMC is about as highly motivated as possible to build their new facilities in Arizona, and the first one has taken three years so far, with the second one likely delayed until 2028.  Actual construction and manufacturing at scale cannot be trivially waved away.

I do think that AI research has the potential to be enormously disruptive.  It also seems that if a big corporation or nation-state thought that they could gain a commanding advantage by deploying something even if it's half-baked and the long-term consequences are unknown, they will 100% do it.  I'd be shocked if the large financial companies aren't already doing this in some form.  I also agree that broadly speaking as a species we are unprepared for the consequences of this research, good and bad.  Hopefully we will stumble forward in a way where we don't do insanely stupid things (like putting the WOPR in charge of the missiles without humans in the loop).   

Ok, enough of my uninformed digression.  Back to physics soon.

Update:  this is a fun, contrasting view by someone who definitely disagrees with Aschenbrenner about the imminence of AGI.

Sunday, June 02, 2024

Materials families: Halide perovskites

Looking back, I realized that I haven't written much about halide perovskites, which is quite an oversight given how much research impact they're having.  I'm not an expert, and there are multiple extensive review articles out there (e.g. here, here, here, here, here), so this will only be a very broad strokes intro, trying to give some context to why these systems are important, remarkable, and may have plenty of additional tricks to play.

From ACS Energy Lett. 5, 2, 604–610 (2020).

Perovskites are a class of crystals based on a structural motif (an example is ABX3, originally identified in the mineral CaTiO3, though there are others) involving octahedrally coordinated metal atoms.  As shown in the figure, each B atom sits at the center of an octahedron defined by six X atoms.  There are many flavors of purely inorganic perovskites, including the copper oxide superconductors and various piezoelectric and ferroelectric oxides.  

The big excitement in recent years, though, involves halide perovskites, in which the X atom is a halogen (Cl, Br, or I) and the B atom is most often Pb or Sn.  These materials are quite ionic, in the sense that the B atom is in the 2+ oxidation state, the X atom is in the 1- oxidation state, and whatever is in the A site is in the 1+ oxidation state (whether it's Cs+ or a molecular ion like methylammonium (MA = [CH3NH3]+) or formamidinium (FA = [HC(NH2)2]+)).  
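As a trivial sanity check (my own snippet, not from the post), those oxidation states balance exactly for the ABX3 formula unit:

```python
# Charge balance for an ABX3 halide perovskite, e.g. MAPbI3:
# one A-site ion (1+), one B-site ion (2+), three X-site halides (1- each).
charge_A = +1   # Cs+, MA+, or FA+
charge_B = +2   # Pb2+ or Sn2+
charge_X = -1   # Cl-, Br-, or I-

total = charge_A + charge_B + 3 * charge_X
print(total)  # 0: the formula unit is charge-neutral
```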

From Chem. Rev. 123, 13, 8154–8231 (2023).

There is an enormous zoo of materials based on these building blocks, made even richer by the capability of organic chemists to toss in various small organic, covalent ligands to alter spacings between the components (and hence electronic overlap and bandwidths), tilt or rotate the octahedra, add in chirality, etc.  Forms that are 3D, effectively 2D (layers of corner-sharing octahedra), 1D, and "0D" (with isolated octahedra) all exist.  Remarkably:

  • These materials can be processed in solution form, and it's possible to cast highly crystalline films.
  • Despite the highly ionic character of much of the bonding, many of these materials are semiconductors, with bandgaps in the visible.
  • Despite the differences in what chemists and semiconductor physicists usually mean by "pure", these materials can be sufficiently clean and free of the wrong kinds of defects that it is possible to make solar cells with efficiencies greater than 26% (!) (and very bright light emitting diodes).  
These features make the halide perovskites extremely attractive for possible applications, especially in photovoltaics and potentially light sources (even quantum emitters).  They are seemingly much more forgiving than most organic semiconductors, with high carrier mobilities, tolerance of disorder, and a high dielectric polarizability (and hence a lower exciton binding energy and greater ease of charge extraction).  The halide perovskites do face some serious challenges (chemical stability under UV illumination and air/moisture exposure; the unpleasantness of Pb), but their promise is enormous.

Sometimes nature seems to provide materials with particularly convenient properties.  Examples include water and the fact that ordinary ice is less dense than the liquid form; silicon and its outstanding oxide; gallium arsenide and the fact that it can be grown with great purity and stoichiometry even in an extremely As rich environment; I'm sure commenters can provide many more.  The halide perovskites seem to be another addition to this catalog, and as material properties continue to improve, condensed matter physicists are going to be looking for interesting things to do in these systems.