Contributed by Martin Korth.

It's an open question: Can more parameters efficiently replace knowledge about the underlying physical behaviour? Some examples:

Classical potentials usually have a fixed functional form, which is chosen to capture the basics of (inter)molecular interactions. Neural networks[1], Gaussian approximation potentials[2] and other machine-learning approaches do not rely on such fixed functional forms, but it's unclear whether systems with complex chemistry (for instance involving more than a handful of different elements) can be efficiently treated this way in general.

One step up on the theory ladder, semi-empirical quantum mechanical (SQM) methods that rely more heavily on parametrization (like PM6, PM7) seem to perform overall as well as SQM methods that try to capture more of the correct physics by introducing orthogonalization corrections (OMx), but if one looks at complicated chemistry, the latter turn out to be more robust.[3]

Another step up, the highlighted paper by Schwabe comes into play: in the field of density functional theory (DFT) exchange-correlation (XC) functional development, too, people have tried both pathways -- more physics (like including dispersion in a DFT-D type fashion[4]) and more parameters (as some Minnesota-type functionals by Truhlar and co-workers do to capture -- at least mid-range -- dispersion[5]).

Schwabe now assesses a number of functionals for isomerization reactions in which heteroatoms are systematically replaced with heavier atoms of the same group. With this setup he has found a clever way to probe whether the performance of a method depends on the elements involved and thus on the external potential -- which should not be the case, at least for the 'true' functional. Schwabe indeed finds no such dependence except in one case, M11-L, a functional on the more-parameters side of things, thereby suggesting that, as for SQM methods, the more-physics pathway might be the safer road to follow.

It would be interesting to see how SQM methods (including SCC-DFTB) perform on Schwabe's set, though in this case one would be probing not a dependence on the external potential but element-specific parametrization effects (in the past we found no element-specific trends for PM6[3], and also no such trends for WFT/DFT methods -- including some Minnesota functionals -- in our 'mindless' benchmarking[6]).

References:

[1] J. Behler, Representing Potential-Energy Surfaces by High-Dimensional Neural Network Potentials, J. Phys.: Condens. Matter 26 (2014) 183001.

[2] A. P. Bartok, M. C. Payne, R. Kondor, G. Csanyi, Gaussian Approximation Potentials: The Accuracy of Quantum Mechanics, without the Electrons, Phys. Rev. Lett. 104 (2010) 136403.
[3] M. Korth, W. Thiel, Benchmarking Semiempirical Methods for Thermochemistry, Kinetics and Noncovalent Interactions: OMx Methods are Almost as Accurate and Robust as DFT-GGA Methods for Organic Molecules. J. Chem. Theory Comput., 2011, 7, 2929.

[4] S. Grimme, Density functional theory with London dispersion corrections, WIREs: Comp. Mol. Sci. 2011, 1, 211.

[5] R. Peverati, D. G. Truhlar, The Quest for a Universal Density Functional: The Accuracy of Density Functionals Across a Broad Spectrum of Databases in Chemistry and Physics, Phil. Trans. R. Soc. A 372 (2014) 20120476.

[6] M. Korth, S. Grimme, 'Mindless' DFT Benchmarking, J. Chem. Theory Comput., 2009, 5, 993.