Sunday, February 22, 2026

AI/ML, multiscale modeling, and emergence

I've been attending a lot of talks lately about AI/machine learning and multiscale modeling for materials design and control.  This is a vast, rapidly evolving research area, so here is a little background and a few disorganized thoughts.  

For a recent review article about AI and materials discovery, see here.  There is a ton of work being done pursuing the grand goal of inverse design - name some desired properties, and have AI/ML formulate a material that fits those requirements and is actually synthesizable.  Major companies with publicly known efforts include Google DeepMind (with GNoME), Microsoft, Meta (working on catalysts), Toyota Research Institute, and IBM, and I'm certain that I'm missing major players.  There is also a slew of startup companies working on this (e.g. Periodic).  

In addition to materials design and discovery, there is enormous effort being put into using AI/ML to bridge across length and time scales.  Quantum chemistry methods can capture microscopic physics and chemistry, for example, but extending them to macroscopic system sizes with realistic disorder is often computationally intractable.  There are approaches like time-dependent DFT and DMFT that try to capture dynamics, but following dynamics even as long as picoseconds is hard.  By using microscopic methods plus ML to compute and then parametrize force fields between atoms (for example), one can treat larger systems and longer timescales using molecular dynamics for the atomic motions.  However, getting from there to, e.g., the Navier-Stokes equations or an understanding of phase boundaries is very difficult.  (At the same time, there are approaches that use AI/ML to learn about the solutions of partial differential equations, so that one can, for example, compute good fluid flows quickly without actually having to solve the N-S equations - see here.) 
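To make the force-field idea concrete, here is a minimal, dependency-light sketch of the workflow. Everything here is a stand-in: the "expensive microscopic method" is faked with a Lennard-Jones pair potential, and the "machine-learned" model is just a linear fit over inverse-power basis functions rather than the neural networks and many-body descriptors used in real ML force fields (NequIP, MACE, etc.). The point is only the structure of the pipeline: sample expensive reference energies, fit a cheap surrogate, then evaluate the surrogate inside a molecular dynamics loop.

```python
import numpy as np

# Stand-in for an expensive microscopic calculation (e.g. DFT):
# here just the Lennard-Jones pair energy in reduced units.
def reference_energy(r):
    return 4.0 * (r**-12 - r**-6)

# Sample pair distances where we can afford the expensive calculation.
r_train = np.linspace(0.9, 2.5, 40)
E_train = reference_energy(r_train)

# "Machine-learned" surrogate: least-squares fit over basis functions
# r^-1 ... r^-12.  (A real ML force field would use a neural network
# with symmetry-respecting descriptors; this keeps the sketch simple.)
powers = np.arange(1, 13)
X = r_train[:, None] ** -powers[None, :]
coef, *_ = np.linalg.lstsq(X, E_train, rcond=None)

def surrogate_energy(r):
    """Cheap fitted pair potential, evaluable millions of times in MD."""
    r = np.atleast_1d(r)
    return (r[:, None] ** -powers[None, :]) @ coef

# Check the surrogate against held-out reference distances.
r_test = np.linspace(1.0, 2.4, 100)
err = np.max(np.abs(surrogate_energy(r_test) - reference_energy(r_test)))
print(f"max surrogate error on test distances: {err:.2e}")
```

The design choice that matters is the same one the real codes face: the surrogate is only trustworthy inside the region sampled by the reference data, which is why active learning (going back to the expensive method when the MD trajectory wanders into unfamiliar configurations) is a big part of practical ML force-field work.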

We want to keep coarse-graining (looking at larger scales) while maintaining the microscopic physics constraints so that the results stay accurate.  There seems to be a lot of hope that, either by design or by the action of the AI/ML tools themselves, we can come up with descriptors that capture the essential physics as we move to larger and larger scales.  To use a fluids example, somehow we are hoping that these tools will naturally discover that, at scales much larger than a single water molecule, the right quantities to track are density, temperature, velocity fields, surface tension, liquid-vapor interfaces, etc.  

(Image: from the always fun xkcd.)

One rough description of emergence is the idea that at larger scales and numbers of constituents, new properties appear for the collective system that are extremely difficult to predict from the microscopic rules governing the constituents.  For example, starting from the Schrödinger equation and basic quantum mechanics, it's very hard to determine that snowflakes tend to have 6-fold symmetry and that ice will float in water, even though the latter are of course consequences of the former.  A nice article about emergence in physics is here.  

It feels to me like in some AI/ML endeavors, we are hoping that these tools will figure out how emergence works better than humans have been able to do.  This is certainly a worthy challenge, and it may well succeed in a lot of systems, but then we may have the added meta-challenge of trying to understand how our tools did that.  Physics-informed and structured ML will hopefully take us well beyond the situation in the xkcd comic shown here.  


