
Abstract

In the past two decades, machine learning potentials (MLPs) have reached a level of maturity that now enables applications to large-scale atomistic simulations of a wide range of systems in chemistry, physics, and materials science. Different machine learning algorithms have been used with great success in the construction of these MLPs. In this review, we discuss an important group of MLPs relying on artificial neural networks to establish a mapping from the atomic structure to the potential energy. In spite of this common feature, there are important conceptual differences among MLPs, which concern the dimensionality of the systems, the inclusion of long-range electrostatic interactions, global phenomena like nonlocal charge transfer, and the type of descriptor used to represent the atomic structure, which can be either predefined or learnable. A concise overview is given along with a discussion of the open challenges in the field.
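The construction shared by many of the neural network potentials discussed here, going back to the Behler–Parrinello approach, writes the total energy as a sum of atomic energies, each predicted by a small feed-forward network from a predefined descriptor of that atom's local environment (e.g., atom-centered symmetry functions). A minimal sketch of this idea follows; the symmetry-function parameters, network sizes, and weights are purely illustrative and do not correspond to any published implementation:

```python
import numpy as np

def radial_sf(r_ij, eta, r_s, r_c):
    """Radial symmetry function: Gaussian of neighbor distances times a
    smooth cosine cutoff, summed over all neighbors within r_c."""
    fc = np.where(r_ij < r_c, 0.5 * (np.cos(np.pi * r_ij / r_c) + 1.0), 0.0)
    return np.sum(np.exp(-eta * (r_ij - r_s) ** 2) * fc)

def atomic_energy(g, W1, b1, W2, b2):
    """Tiny atomic-energy network: one tanh hidden layer, scalar output."""
    h = np.tanh(W1 @ g + b1)
    return (W2 @ h + b2).item()

def total_energy(positions, params):
    """Total energy as a sum of atomic energies, each computed from the
    descriptor vector of that atom's local environment."""
    energies = []
    for i in range(len(positions)):
        r_ij = np.linalg.norm(positions - positions[i], axis=1)
        r_ij = r_ij[r_ij > 0.0]  # exclude the self-distance
        # Three illustrative (eta, r_s) parameter pairs, cutoff r_c = 6.0
        g = np.array([radial_sf(r_ij, eta, r_s, r_c=6.0)
                      for eta, r_s in [(0.5, 0.0), (1.0, 1.0), (2.0, 2.0)]])
        energies.append(atomic_energy(g, *params))
    return sum(energies)
```

Because each atomic energy depends only on interatomic distances within the cutoff, and the total is an order-independent sum, the model is invariant under permutation of like atoms, translation, and rotation; these invariances are exactly what the descriptor-based architectures reviewed here are designed to guarantee.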

/content/journals/10.1146/annurev-physchem-082720-034254
2022-04-20
2024-06-13

  • Article Type: Review Article