
Abstract

Machine learning (ML) is transforming all areas of science. The complex and time-consuming calculations in molecular simulations are particularly suitable for an ML revolution and have already been profoundly affected by the application of existing ML methods. Here we review recent ML methods for molecular simulation, with particular focus on (deep) neural networks for the prediction of quantum-mechanical energies and forces, on coarse-grained molecular dynamics, on the extraction of free energy surfaces and kinetics, and on generative network approaches to sample molecular equilibrium structures and compute thermodynamics. To explain these methods and illustrate open methodological problems, we review some important principles of molecular physics and describe how they can be incorporated into ML structures. Finally, we identify and describe a list of open challenges for the interface between ML and molecular simulation.

Annual Review of Physical Chemistry, Vol. 71 (2020) · DOI: 10.1146/annurev-physchem-042018-052331 · Published 2020-04-20

Article Type: Review Article