*DeePCG: constructing coarse-grained models via deep neural networks.* L Zhang, J Han, H Wang, R Car, Weinan E.

**arXiv:1802.08549v2** [physics.chem-ph]

Contributed by Jesper Madsen

The idea of “learning” a molecular force field (FF) using neural networks can be traced back to Blank et al. in 1995.[1] Modern variations (reviewed recently by Behler[2]), such as the DeePCG scheme[3] that I highlight here, have two key innovations that set them apart from earlier work: network depth and atomic environment descriptors. The latter was the topic of my recent highlight, and Zhang et al.[3] take advantage of similar ideas.

Zhang et al. simulate liquid water using *ab initio* molecular dynamics (AIMD) at the DFT/PBE0 level of theory in order to train a coarse-grained (CG) molecular water model. The training follows a standard protocol used in CGing, where mean forces are fitted by minimizing a loss function (the natural choice is the residual sum of squares) over the sampled configurations. CGing liquid water is difficult because of the necessity of many-body contributions to interactions, especially so upon integrating out degrees of freedom. One would therefore expect a FF capable of capturing such many-body effects to perform well, just as DeePCG does, and I think this is a very nice example of exactly how much can be gained by using faithful representations of atomic neighborhoods instead of radially symmetric pair potentials. Recall that traditional force-matching, while provably exact in the limit of the complete many-body expansion,[4] still shows non-negligible deviations from the target distributions for most simple liquids when standard approximations are used.
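To make the force-matching idea concrete, here is a minimal sketch of the fitting step on synthetic data. This is not the DeePCG network; it fits a simple pairwise radial force that is linear in its parameters, so minimizing the residual sum of squares reduces to linear least squares. The reference "mean forces", the Gaussian basis, and all numerical choices below are illustrative assumptions.

```python
import numpy as np

# Force-matching sketch: fit f(r) = sum_k c_k * phi_k(r) to reference
# mean-force samples by minimizing the residual sum of squares.
# All data here are synthetic stand-ins for AIMD-derived mean forces.

rng = np.random.default_rng(0)

# Synthetic pair distances and noisy "reference" mean forces
r = rng.uniform(2.5, 6.0, size=500)
f_ref = 4.0 * (r - 3.5) * np.exp(-((r - 3.5) ** 2)) \
        + 0.05 * rng.normal(size=r.size)

# Gaussian basis functions phi_k centered on a radial grid
centers = np.linspace(2.5, 6.0, 12)
width = 0.4
Phi = np.exp(-(((r[:, None] - centers[None, :]) / width) ** 2))

# Minimize ||Phi @ c - f_ref||^2 (the residual sum of squares)
c, *_ = np.linalg.lstsq(Phi, f_ref, rcond=None)

rss = float(np.sum((Phi @ c - f_ref) ** 2))
print(f"residual sum of squares: {rss:.3f}")
```

The point of DeePCG (and of many-body machine-learnt potentials generally) is precisely that the model need not be linear in parameters or restricted to pairwise radial forms; the deep network replaces the basis expansion above while the loss being minimized stays the same.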
FF transferability, however, is likely where the current grand challenge is to be found. Zhang et al. remark that it would be convenient to have an accurate yet cheap (e.g., CG) model for describing phase transitions in water. They do not attempt this in the current preprint, but I suspect that it is not *that* easy to make a decent CG model that correctly captures subtle long-range correlations at various densities, let alone different phases of water and ice, coexistences, interfaces, impurities (non-water moieties), etc. Machine-learnt potentials consistently demonstrate excellent accuracy over the parameterization space of states or configurations, but for transferability and extrapolation, we are still waiting to see how far they can get.

### References

[1] *Neural network models of potential energy surfaces.* TB Blank, SD Brown, AW Calhoun, DJ Doren. **J Chem Phys** 103, 4129 (1995)

[2] *Perspective: Machine learning potentials for atomistic simulations.* J Behler. **J Chem Phys** 145, 170901 (2016)

[3] *DeePCG: constructing coarse-grained models via deep neural networks.* L Zhang, J Han, H Wang, R Car, Weinan E. **arXiv:1802.08549v2** [physics.chem-ph]

[4] *The multiscale coarse-graining method. I. A rigorous bridge between atomistic and coarse-grained models.* WG Noid, J-W Chu, GS Ayton, V Krishna, S Izvekov, GA Voth, A Das, HC Andersen. **J Chem Phys** 128, 244114 (2008)