
Abstract

Molecular dynamics (MD) enables the study of physical systems with excellent spatiotemporal resolution but suffers from severe timescale limitations. To address this, enhanced sampling methods have been developed to improve the exploration of configurational space. However, implementing these methods is challenging and requires domain expertise. In recent years, the integration of machine learning (ML) techniques into many domains has shown promise, prompting their adoption in enhanced sampling as well. Although ML is often adopted in other fields primarily for its data-driven nature, its integration with enhanced sampling is more natural, as the two share many underlying synergies. This review explores the merging of ML and enhanced MD by presenting different shared viewpoints. It offers a comprehensive overview of this rapidly evolving field, which can be difficult to stay current on. We highlight successful strategies such as dimensionality reduction, reinforcement learning, and flow-based methods. Finally, we discuss open problems at the exciting ML-enhanced MD interface.
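As a concrete illustration of the dimensionality-reduction strategies highlighted above, one widely used approach is time-lagged independent component analysis (tICA), which finds the slowest linear modes of a trajectory of molecular features. The sketch below is a minimal, self-contained implementation for illustration only; the function name `tica` and its arguments are chosen here for the example and do not correspond to any specific package discussed in the review.

```python
import numpy as np

def tica(X, lag=1, n_components=2):
    """Minimal tICA sketch: project a feature trajectory onto its slowest modes.

    X: array of shape (n_frames, n_features), e.g., dihedrals or distances.
    lag: lag time in frames used for the time-lagged covariance.
    """
    X = X - X.mean(axis=0)               # center the features
    X0, Xt = X[:-lag], X[lag:]           # instantaneous and time-lagged views
    n = len(X0)
    C0 = X0.T @ X0 / n                   # instantaneous covariance
    Ct = X0.T @ Xt / n                   # time-lagged covariance
    Ct = 0.5 * (Ct + Ct.T)               # symmetrize (reversible estimate)
    # Generalized eigenproblem Ct v = lambda C0 v, solved via C0^{-1} Ct
    evals, evecs = np.linalg.eig(np.linalg.solve(C0, Ct))
    order = np.argsort(evals.real)[::-1] # largest eigenvalue = slowest mode
    return X @ evecs.real[:, order[:n_components]]
```

The leading tICA components can then serve as collective variables for enhanced sampling methods such as metadynamics; production work would typically use a maintained implementation with proper regularization of `C0`.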


  • Article Type: Review Article