
Abstract

Machine learning, applied to chemical and materials data, is transforming the field of materials discovery and design, yet significant work is still required to fully take advantage of machine learning algorithms, tools, and methods. Here, we review the accomplishments of the community to date and assess the maturity of state-of-the-art, data-intensive research activities that combine perspectives from materials science and chemistry. We focus on three major themes—learning to see, learning to estimate, and learning to search materials—to show how advanced computational learning technologies are being applied, rapidly and successfully, to solve materials and chemistry problems. Additionally, we discuss a clear path toward a future where data-driven approaches to materials discovery and design are standard practice.

Annu. Rev. Mater. Res. 50. https://doi.org/10.1146/annurev-matsci-082019-105100. Published July 1, 2020.

Article Type: Review Article