
Abstract

This article reviews recent developments in the application of machine learning, data-driven modeling, transfer learning, and autonomous experimentation to the discovery, design, and optimization of soft and biological materials. The design and engineering of molecules and molecular systems have long been a preoccupation of chemical and biomolecular engineers, pursued with a variety of computational and experimental techniques. Increasingly, researchers are integrating emerging and established tools from artificial intelligence and machine learning with conventional approaches in chemical science to realize powerful, efficient, and in some cases autonomous platforms for molecular discovery, materials engineering, and process optimization. This review summarizes the basic principles underpinning these techniques and highlights recent successful applications in autonomous materials discovery, transfer learning, and multi-fidelity active learning.
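The active-learning and Bayesian-optimization workflows surveyed in the review share a common loop: fit a probabilistic surrogate (typically a Gaussian process) to the measurements collected so far, score untested candidates with an acquisition function such as expected improvement, and perform the most promising experiment next. The following is a minimal, self-contained sketch of that loop, not an implementation from the review; the objective `f`, the kernel hyperparameters, and the 1-D design space are all illustrative assumptions standing in for an expensive experiment or simulation.

```python
import numpy as np
from math import erf

def rbf_kernel(A, B, length=0.3, var=1.0):
    """Squared-exponential covariance between two sets of 1-D points."""
    d = A[:, None] - B[None, :]
    return var * np.exp(-0.5 * (d / length) ** 2)

def gp_posterior(X, y, Xs, noise=1e-6):
    """Gaussian process posterior mean and standard deviation at test points Xs."""
    K = rbf_kernel(X, X) + noise * np.eye(len(X))
    Ks = rbf_kernel(X, Xs)
    Kss = rbf_kernel(Xs, Xs)
    alpha = np.linalg.solve(K, y)
    mu = Ks.T @ alpha
    v = np.linalg.solve(K, Ks)
    var = np.clip(np.diag(Kss) - np.sum(Ks * v, axis=0), 1e-12, None)
    return mu, np.sqrt(var)

def expected_improvement(mu, sigma, best):
    """EI acquisition: expected gain over the best observation so far."""
    z = (mu - best) / sigma
    cdf = 0.5 * (1.0 + np.vectorize(erf)(z / np.sqrt(2.0)))
    pdf = np.exp(-0.5 * z**2) / np.sqrt(2.0 * np.pi)
    return (mu - best) * cdf + sigma * pdf

f = lambda x: np.sin(6 * x) * x          # hypothetical "experiment" to optimize
rng = np.random.default_rng(0)
X = rng.uniform(0, 1, 3)                 # three initial random measurements
y = f(X)
grid = np.linspace(0, 1, 200)            # discretized candidate design space

for _ in range(10):                      # active-learning iterations
    mu, sigma = gp_posterior(X, y, grid)
    x_next = grid[np.argmax(expected_improvement(mu, sigma, y.max()))]
    X, y = np.append(X, x_next), np.append(y, f(x_next))
```

In an autonomous-experimentation platform, the call to `f` would be replaced by a robotic synthesis-and-characterization step, and multi-fidelity variants would maintain correlated surrogates over cheap (e.g., coarse-grained simulation) and expensive (e.g., wet-lab) information sources.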

DOI: 10.1146/annurev-chembioeng-092120-020803
2022-06-07
2024-04-25

Article Type: Review Article