One of the most important achievements in 20th century physics and chemistry is density functional theory (DFT). In 1964, Pierre Hohenberg and Walter Kohn proved a rather amazing result: If you want to know the ground state electronic properties of any condensed quantum electronic system (e.g., a solid, or a molecule), you can get all of that information (even in a complicated, interacting, many-body system!) just from knowing the ground state electronic density everywhere, \(n(\mathbf{r})\). That is, any property you might care about (e.g., the ground state energy) can be computed from \(n(\mathbf{r})\) as a functional \(F[n(\mathbf{r})]\). (A functional is a function of a function - in this case \(F\) depends in detail on the value of \(n\) everywhere, and returns a number.) The Hohenberg-Kohn paper is one of the most-cited physics papers of all time, which suggests its importance. So, truly figuring out the electronic structure of molecules and materials just becomes a matter of finding an extremely good approximation to \(n(\mathbf{r})\).
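To make the idea of a functional concrete (this example is mine, not from the Hohenberg-Kohn paper): the classical electrostatic repulsion energy of a charge distribution is a functional of the density,
\[
E_{\mathrm{H}}[n] = \frac{1}{2}\int\!\!\int \frac{n(\mathbf{r})\,n(\mathbf{r}')}{|\mathbf{r}-\mathbf{r}'|}\,d^{3}r\,d^{3}r'
\]
(in atomic units). You hand it the entire function \(n(\mathbf{r})\), and it hands back a single number - exactly what distinguishes a functional from an ordinary function.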
Moreover, Kohn and Lu Jeu Sham then went further, and found a practical computational approach that lets you work with an effective system of equations to find \(n(\mathbf{r})\) and the ground state energy. In this formulation, they write the total energy functional \(E[n(\mathbf{r})]\) as a sum of three pieces: a kinetic energy term that may be written as a comparatively simple expression; a potential energy term that is easy to write down and is simply interpreted as classical Coulomb interactions; and a third bit, the "exchange-correlation functional", \(E_{\mathrm{xc}}[n(\mathbf{r})]\), which no one knows how to write down exactly.
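Written out in the standard textbook form (atomic units), that decomposition reads
\[
E[n] = T_{s}[n] + \int v_{\mathrm{ext}}(\mathbf{r})\,n(\mathbf{r})\,d^{3}r + \frac{1}{2}\int\!\!\int \frac{n(\mathbf{r})\,n(\mathbf{r}')}{|\mathbf{r}-\mathbf{r}'|}\,d^{3}r\,d^{3}r' + E_{\mathrm{xc}}[n],
\]
where \(T_{s}[n]\) is the kinetic energy of a fictitious non-interacting system with the same density, the middle two terms are the easy-to-write potential energy (attraction to the nuclei, plus the Coulomb repulsion of the density with itself), and everything genuinely hard about the many-body problem is swept into \(E_{\mathrm{xc}}[n]\).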
You might think that not knowing how to write down an exact expression for \(E_{\mathrm{xc}}[n(\mathbf{r})]\) would be a showstopper. However, people have come up with many different approximation schemes for it, and DFT has been hugely useful in understanding the electronic properties of solids and molecules.
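The simplest of these, the local density approximation (LDA), treats each point in space as if it were a uniform electron gas at the local density. Its exchange piece even has a closed form (again in atomic units),
\[
E_{x}^{\mathrm{LDA}}[n] = -\frac{3}{4}\left(\frac{3}{\pi}\right)^{1/3}\int n(\mathbf{r})^{4/3}\,d^{3}r ,
\]
while its correlation piece is parametrized from quantum Monte Carlo results for the uniform electron gas.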
In recent years, though, some people have been wondering if it's possible to use "machine learning" - essentially having a computer derive an extremely good look-up table or interpolation - to approach an exact description of \(E_{\mathrm{xc}}[n(\mathbf{r})]\). This is not a crazy idea at all, based on engineering history and dimensional analysis. For example, actually writing down an analytical expression for the pressure drop of water flowing through a rough pipe is not generally possible. However, dimensional analysis tells us that the pressure drop depends on just two dimensionless ratios (the Reynolds number and the relative roughness of the pipe), and a zillion experiments can be run to map out a look-up table for what that un-writable function looks like - that's what the classic Moody chart is. Perhaps with computers that are becoming incredibly good at identifying patterns and underlying trends, we can do something similar with \(E_{\mathrm{xc}}[n(\mathbf{r})]\). One of the main groups pursuing this kind of idea is that of Kieron Burke, and last week a preprint appeared on the arXiv from another group arguing that machine learning can be useful in another many-body approach, dynamical mean field theory. Who knows: Maybe someday the same types of algorithms that guess songs for you on Pandora and suggest books on Amazon will pave the way for real progress in "materials by design"!
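To give a flavor of what "learning a functional" can look like in practice, here is a minimal sketch in Python (using NumPy and scikit-learn). Everything in it is a toy stand-in of my own: the "densities" are synthetic bumps on a one-dimensional grid, and the target "functional" is made up. The one faithful ingredient is the method itself - kernel ridge regression with a Gaussian kernel, the kind of interpolation used in early work from the Burke group.

import numpy as np
from sklearn.kernel_ridge import KernelRidge

rng = np.random.default_rng(0)
grid = np.linspace(0.0, 1.0, 100)
dx = grid[1] - grid[0]

def toy_density(center, width):
    # Gaussian bump normalized to unit "charge" - a stand-in for n(x).
    n = np.exp(-((grid - center) ** 2) / (2.0 * width ** 2))
    return n / (n.sum() * dx)

def toy_functional(n):
    # Made-up target "functional": a grid integral of n^(4/3), loosely
    # echoing the form of LDA exchange. Not a real energy.
    return -np.sum(n ** (4.0 / 3.0)) * dx

# Training data: 200 random densities and their "energies".
params = rng.uniform(low=[0.3, 0.05], high=[0.7, 0.15], size=(200, 2))
X = np.array([toy_density(c, w) for c, w in params])
y = np.array([toy_functional(n) for n in X])

# Kernel ridge regression with a Gaussian (RBF) kernel; the
# hyperparameters here are arbitrary, not tuned.
model = KernelRidge(kernel="rbf", alpha=1e-8, gamma=0.1)
model.fit(X, y)

# Ask the trained model for the "energy" of a density it has never seen.
n_new = toy_density(0.5, 0.1)
print("ML estimate:", model.predict(n_new[None, :])[0])
print("exact value:", toy_functional(n_new))

The point of the sketch is the workflow: represent each density by its values on a grid, train a kernel model on density-energy pairs, and then interpolate to new densities. The real research challenge is doing this well enough, on hard enough systems, to beat the existing hand-built approximations.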
Here is an example of this kind of approach in soft condensed matter:
http://www.nature.com/nmat/journal/v12/n4/full/nmat3543.html
This group from Chicago used a machine-learning-style optimization to tailor the shape of particles so as to obtain a desired response under pressure.