
Abstract

Advances in machine learning have impacted myriad areas of materials science, such as the discovery of novel materials and the improvement of molecular simulations, with many more important developments likely to come. Given the rapid pace of change in this field, it is challenging to grasp both the breadth of opportunities and the best practices for their use. In this review, we address aspects of both problems: we first provide an overview of the areas in which machine learning has recently had significant impact in materials science, and then discuss in more detail how to determine the accuracy and domain of applicability of some common types of machine learning models. Finally, we discuss opportunities and challenges for the materials community in fully utilizing the capabilities of machine learning.

/content/journals/10.1146/annurev-matsci-070218-010015
Published: 2020-07-01

Literature Cited

  1. 1. 
    Silver D, Hubert T, Schrittwieser J, Antonoglou I, Lai M et al. 2017. Mastering chess and shogi by self-play with a general reinforcement learning algorithm. arXiv:1712.01815 [cs.AI]
  2. 2. 
    Silver D, Huang A, Maddison CJ, Guez A, Sifre L et al. 2016. Mastering the game of Go with deep neural networks and tree search. Nature 529:7587484–89
    [Google Scholar]
  3. 3. 
    Morav M, Schmid M, Burch N, Lisý V, Morrill D et al. 2017. DeepStack: expert-level artificial intelligence in heads-up no-limit poker. Science 356:6337508–13
    [Google Scholar]
  4. 4. 
    Brown N, Sandholm T. 2018. Superhuman AI for heads-up no-limit poker: Libratus beats top professionals. Science 359:6374418–24
    [Google Scholar]
  5. 5. 
    Ferrucci D, Brown E, Chu-carroll J, Fan J, Gondek D et al. 2010. Building Watson: an overview of the DeepQA project. AI Mag 31:359–79
    [Google Scholar]
  6. 6. 
    Jordan MI, Mitchell TM. 2015. Machine learning: trends, perspectives, and prospects. Science 349:6245255–60
    [Google Scholar]
  7. 7. 
    Jouppi NP, Young C, Patil N, Patterson D, Agrawal G et al. 2017. In-datacenter performance analysis of a tensor processing unit. Proceedings of the 44th Annual International Symposium on Computer Architecture1–12 New York: Assoc. Comput. Mach.
    [Google Scholar]
  8. 8. 
    Olson GB. 2000. Designing a new material world. Science 288:5468993–98
    [Google Scholar]
  9. 9. 
    Panchal JH, Kalidindi SR, McDowell DL 2013. Key computational modeling issues in integrated computational materials engineering. Comput.-Aided Des. 45:14–25
    [Google Scholar]
  10. 10. 
    McDowell DL, Kalidindi SR. 2016. The materials innovation ecosystem: a key enabler for the Materials Genome Initiative. MRS Bull 41:4326–35
    [Google Scholar]
  11. 11. 
    Kailil T, Wadia C. 2011. Materials Genome Initiative for global competitiveness White Pap., Nat. Sci. Technol. Counc. Washington, DC: https://www.mgi.gov/sites/default/files/documents/materials_genome_initiative-final.pdf
  12. 12. 
    de Pablo JJ, Jackson NE, Webb MA, Chen LQ, Moore JE et al. 2019. New frontiers for the Materials Genome Initiative. npj Comput. Mater. 5:41
    [Google Scholar]
  13. 13. 
    Ye W, Chen C, Wang Z, Chu I-H, Ong SP 2018. Deep neural networks for accurate predictions of crystal stability. Nat. Commun. 9:3800
    [Google Scholar]
  14. 14. 
    Li W, Jacobs R, Morgan D 2018. Predicting the thermodynamic stability of perovskite oxides using machine learning models. Comput. Mater. Sci. 150:454–63
    [Google Scholar]
  15. 15. 
    Balachandran PV, Emery AA, Gubernatis JE, Lookman T, Wolverton C, Zunger A 2018. Predictions of new ABO3 perovskite compounds by combining machine learning and density functional theory. Phys. Rev. Mater. 2:4043802
    [Google Scholar]
  16. 16. 
    Faber FA, Lindmaa A, von Lilienfeld OA, Armiento R 2016. Machine learning energies of 2 million elpasolite (ABC2D6) crystals. Phys. Rev. Lett. 117:13135502
    [Google Scholar]
  17. 17. 
    Hautier G, Fischer CC, Jain A, Mueller T, Ceder G 2010. Finding nature's missing ternary oxide compounds using machine learning and density functional theory. Chem. Mater. 22:123762–67
    [Google Scholar]
  18. 18. 
    Meredig B, Agrawal A, Kirklin S, Saal JE, Doak JW et al. 2014. Combinatorial screening for new materials in unconstrained composition space with machine learning. Phys. Rev. B 89:994104
    [Google Scholar]
  19. 19. 
    Stanev V, Oses C, Kusne AG, Rodriguez E, Takeuchi I et al. 2018. Machine learning modeling of superconducting critical temperature. npj Comput. Mater. 4:29
    [Google Scholar]
  20. 20. 
    Meredig B, Antono E, Church C, Hutchinson M, Ling J et al. 2018. Can machine learning identify the next high-temperature superconductor? Examining extrapolation performance for materials discovery. Mol. Syst. Des. Eng. 3:5819–25
    [Google Scholar]
  21. 21. 
    Seko A, Maekawa T, Tsuda K, Tanaka I 2014. Machine learning with systematic density-functional theory calculations: application to melting temperatures of single- and binary-component solids. Phys. Rev. B 89:554303
    [Google Scholar]
  22. 22. 
    Mannodi-Kanakkithodi A, Pilania G, Huan TD, Lookman T, Ramprasad R 2016. Machine learning strategy for accelerated design of polymer dielectrics. Sci. Rep. 6:20952
    [Google Scholar]
  23. 23. 
    Kim C, Pilania G, Ramprasad R 2016. Machine learning assisted predictions of intrinsic dielectric breakdown strength of ABX3 perovskites. J. Phys. Chem. C 120:14575–80
    [Google Scholar]
  24. 24. 
    Kim K, Ward L, He J, Krishna A, Agrawal A, Wolverton C 2018. Machine-learning-accelerated high-throughput materials screening: discovery of novel quaternary Heusler compounds. Phys. Rev. Mater. 2:12123801
    [Google Scholar]
  25. 25. 
    Legrain F, Carrete J, Van Roekeghem A, Madsen GKH, Mingo N 2018. Materials screening for the discovery of new half-Heuslers: machine learning versus ab initio methods. J. Phys. Chem. B 122:2625–32
    [Google Scholar]
  26. 26. 
    Ward L, O'Keeffe SC, Stevick J, Jelbert GR, Aykol M, Wolverton C 2018. A machine learning approach for engineering bulk metallic glass alloys. Acta Mater 159:102–11
    [Google Scholar]
  27. 27. 
    Pilania G, Gubernatis JE, Lookman T 2017. Multi-fidelity machine learning models for accurate bandgap predictions of solids. Comput. Mater. Sci. 129:156–63
    [Google Scholar]
  28. 28. 
    Ramprasad R, Mannodi-Kanakkithodi A, Lookman T, Pilania G, Uberuaga BP, Gubernatis JE 2016. Machine learning bandgaps of double perovskites. Sci. Rep. 6:19375
    [Google Scholar]
  29. 29. 
    Lee J, Seko A, Shitara K, Nakayama K, Tanaka I 2016. Prediction model of band gap for inorganic compounds by combination of density functional theory calculations and machine learning techniques. Phys. Rev. B 93:11115104
    [Google Scholar]
  30. 30. 
    Zhuo Y, Tehrani AM, Brgoch J 2018. Predicting the band gaps of inorganic solids by machine learning. 971668–73
  31. 31. 
    Lu S, Zhou Q, Ouyang Y, Guo Y, Li Q, Wang J 2018. Accelerated discovery of stable lead-free hybrid organic-inorganic perovskites via machine learning. Nat. Commun. 9:3405
    [Google Scholar]
  32. 32. 
    Li Z, Xu Q, Sun Q, Hou Z, Yin W-J 2019. Thermodynamic stability landscape of halide double perovskites via high-throughput computing and machine learning. Adv. Funct. Mater. 29:91807280
    [Google Scholar]
  33. 33. 
    Im J, Lee S, Ko T-W, Kim HW, Hyon Y, Chang H 2019. Identifying Pb-free perovskites for solar cells by machine learning. npj Comput. Mater. 5:37
    [Google Scholar]
  34. 34. 
    Wu H, Lorenson A, Anderson B, Witteman L, Wu H et al. 2017. Robust FCC solute diffusion predictions from ab-initio machine learning methods. Comput. Mater. Sci. 134:160–65
    [Google Scholar]
  35. 35. 
    Lu H-J, Zou N, Jacobs R, Afflerbach B, Lu X-G, Morgan D 2019. Error assessment and optimal cross-validation approaches in machine learning applied to impurity diffusion. Comput. Mater. Sci. 169:109075
    [Google Scholar]
  36. 36. 
    Liu Y, Jacobs R, Lin S, Morgan D 2019. Exploring effective charge in electromigration using machine learning. MRS Commun 9:567–75
    [Google Scholar]
  37. 37. 
    Pilania G, McClellan KJ, Stanek CR, Uberuaga BP 2018. Physics-informed machine learning for inorganic scintillator discovery. J. Chem. Phys. 148:24241729
    [Google Scholar]
  38. 38. 
    Yuan R, Liu Z, Balachandran PV, Xue D, Zhou Y et al. 2018. Accelerated discovery of large electrostrains in BaTiO3-based piezoelectrics using active learning. Adv. Mater. 30:71702884
    [Google Scholar]
  39. 39. 
    Ramprasad R, Batra R, Pilania G, Mannodi-Kanakkithodi A, Kim C 2017. Machine learning and materials informatics: recent applications and prospects. npj Comput. Mater. 3:54
    [Google Scholar]
  40. 40. 
    Jha D, Ward L, Yang Z, Wolverton C, Foster I et al. 2019. IRNet: a general purpose deep residual regression framework for materials discovery. Proceedings of the 25th ACM SIGKDD International Conference on Knowledge Discovery & Data Mining2385–93 New York: Assoc. Comput. Mach.
    [Google Scholar]
  41. 41. 
    Mueller T, Kusne AG, Ramprasad R 2016. Machine learning in materials science: recent progress and emerging applications. In Reviews in Computational Chemistry, Vol. 29 AL Parrill, KB Lipkowitz 186–273 Hoboken, NJ: Wiley
    [Google Scholar]
  42. 42. 
    Raschka S, Mirjalili V. 2017. Python Machine Learning: Machine Learning and Deep Learning with Python, scikit-learn, and TensorFlow Birmingham, UK: Packt, 2nd ed..
  43. 43. 
    Alpaydin E. 2014. Introduction to Machine Learning Boston: MIT Press
  44. 44. 
    Goodfellow I, Bengio Y, Courville A 2016. Deep Learning Cambridge, MA: MIT Press
  45. 45. 
    Nantasenamat C, Isarankura-Na-Ayudhya C, Naenna T, Prachayasittikul V 2009. A practical overview of quantitative structure-activity relationship. EXCLI J 8:74–88
    [Google Scholar]
  46. 46. 
    Karelson M, Lobanov VS, Katritzky AR 1996. Quantum-chemical descriptors in QSAR/QSPR studies. Chem. Rev. 96:31027–44
    [Google Scholar]
  47. 47. 
    Debbichi L, Lee S, Cho H, Rappe AM, Hong KH et al. 2018. Mixed valence perovskite Cs2Au2I6: a potential material for thin-film Pb-free photovoltaic cells with ultrahigh efficiency. Adv. Mater. 30:121707001
    [Google Scholar]
  48. 48. 
    Rouet-Leduc B, Barros K, Lookman T, Humphreys CJ 2016. Optimisation of GaN LEDs and the reduction of efficiency droop using active machine learning. Sci. Rep. 6:24862
    [Google Scholar]
  49. 49. 
    Rouet-Leduc B, Hulbert C, Barros K, Lookman T, Humphreys CJ 2017. Automatized convergence of optoelectronic simulations using active machine learning. Appl. Phys. Lett. 111:443506
    [Google Scholar]
  50. 50. 
    Bassman L, Rajak P, Kalia RK, Nakano A, Sha F et al. 2018. Active learning for accelerated design of layered materials. npj Comput. Mater. 4:74
    [Google Scholar]
  51. 51. 
    Smith JS, Nebgen B, Lubbers N, Isayev O, Roitberg AE 2018. Less is more: sampling chemical space with active learning. J. Chem. Phys. 148:24241733
    [Google Scholar]
  52. 52. 
    Lookman T, Balachandran PV, Xue D, Yuan R 2019. Active learning in materials science with emphasis on adaptive sampling using uncertainties for targeted design. npj Comput. Mater. 5:21
    [Google Scholar]
  53. 53. 
    Lookman T, Balachandran PV, Xue D, Hogden J, Theiler J 2017. Statistical inference and adaptive design for materials discovery. Curr. Opin. Solid State Mater. Sci. 21:3121–28
    [Google Scholar]
  54. 54. 
    Kim C, Chandrasekaran A, Jha A, Ramprasad R 2019. Active-learning and materials design: the example of high glass transition temperature polymers. MRS Commun 9:3860–66
    [Google Scholar]
  55. 55. 
    Granda JM, Donina L, Dragone V, Long DL, Cronin L 2018. Controlling an organic synthesis robot with machine learning to search for new reactivity. Nature 559:7714377–81
    [Google Scholar]
  56. 56. 
    Soldatova LN, Clare A, Sparkes A, King RD 2006. An ontology for a robot scientist. Bioinformatics 22:14464–71
    [Google Scholar]
  57. 57. 
    Talapatra A, Boluki S, Duong T, Qian X, Dougherty E, Arróyave R 2018. Autonomous efficient experiment design for materials discovery with Bayesian model averaging. Phys. Rev. Mater. 2:11113803
    [Google Scholar]
  58. 58. 
    Tabor DP, Roch LM, Saikin SK, Kreisbeck C, Sheberla D et al. 2018. Accelerating the discovery of materials for clean energy in the era of smart automation. Nat. Rev. Mater. 3:55–20
    [Google Scholar]
  59. 59. 
    Nikolaev P, Hooper D, Webber F, Rao R, Decker K et al. 2016. Autonomy in materials research: a case study in carbon nanotube growth. npj Comput. Mater. 2:16031
    [Google Scholar]
  60. 60. 
    Häse F, Roch LM, Aspuru-Guzik A 2019. Next-generation experimentation with self-driving laboratories. Trends Chem 1:3282–91
    [Google Scholar]
  61. 61. 
    MacLeod BP, Parlane FGL, Morrissey TD, Häse F, Roch LM et al. 2019. Self-driving laboratory for accelerated discovery of thin-film materials. arXiv:1906.05398 [physics.app-ph]
  62. 62. 
    Duros V, Grizou J, Xuan W, Hosni Z, Long DL et al. 2017. Human versus robots in the discovery and crystallization of gigantic polyoxometalates. Angew. Chemie Int. Ed. 56:3610815–20
    [Google Scholar]
  63. 63. 
    King RD, Oliver SG, Rowland J, Soldatova LN, Whelan KE et al. 2009. The automation of science. Science 324:592385–89
    [Google Scholar]
  64. 64. 
    Sparkes A, Aubrey W, Byrne E, Clare A, Khan MN et al. 2010. Towards robot scientists for autonomous scientific discovery. Autom. Exp. 2:1
    [Google Scholar]
  65. 65. 
    Zunger A. 2018. Inverse design in search of materials with target functionalities. Nat. Rev. Chem. 2:0121
    [Google Scholar]
  66. 66. 
    Arróyave R, McDowell DL. 2019. Systems approaches to materials design: past, present, and future. Annu. Rev. Mater. Res. 49:10326
    [Google Scholar]
  67. 67. 
    Alberi K, Nardelli MB, Zakutayev A, Mitas L, Curtarolo S et al. 2019. The 2019 materials by design roadmap. J. Phys. D Appl. Phys. 52:113001
    [Google Scholar]
  68. 68. 
    Sanchez-Lengeling B, Aspuru-Guzik A. 2018. Inverse molecular design using machine learning: generative models for matter engineering. Science 361:6400360–65
    [Google Scholar]
  69. 69. 
    Nouira A, Crivello J-C, Sokolovska N 2019. CrystalGAN: learning to discover crystallographic structures with generative adversarial networks. arXiv:1810.11203 [cs.LG]
  70. 70. 
    Sanchez-Lengeling B, Outeiral C, Guimaraes GL, Aspuru-Guzik A 2017. Optimizing distributions over molecular space. An Objective-Reinforced Generative Adversarial Network for Inverse-design Chemistry (ORGANIC). ChemRxiv https://doi.org/10.26434/chemrxiv.5309668.v3
    [Crossref] [Google Scholar]
  71. 71. 
    Putin E, Asadulaev A, Ivanenkov Y, Aladinskiy V, Sanchez-Lengeling B et al. 2018. Reinforced adversarial neural computer for de novo molecular design. J. Chem. Inf. Model. 58:61194–204
    [Google Scholar]
  72. 72. 
    Voyles PM. 2017. Informatics and data science in materials microscopy. Curr. Opin. Solid State Mater. Sci. 21:3141–58
    [Google Scholar]
  73. 73. 
    Dimiduk DM, Holm EA, Niezgoda SR 2018. Perspectives on the impact of machine learning, deep learning, and artificial intelligence on materials, processes, and structures engineering. Integr. Mater. Manuf. Innov. 7:157–72
    [Google Scholar]
  74. 74. 
    Li W, Field KG, Morgan D 2018. Automated defect analysis in electron microscopic images. npj Comput. Mater. 4:36
    [Google Scholar]
  75. 75. 
    Ziatdinov M, Dyck O, Maksov A, Li X, Sang X et al. 2017. Deep learning of atomically resolved scanning transmission electron microscopy images: chemical identification and tracking local transformations. ACS Nano 11:1212742–52
    [Google Scholar]
  76. 76. 
    Park WB, Chung J, Jung J, Sohn K, Singh SP et al. 2017. Classification of crystal structure using a convolutional neural network. IUCrJ 4:486–94
    [Google Scholar]
  77. 77. 
    Stein HS, Guevarra D, Newhouse PF, Soedarmadji E, Gregoire JM 2019. Machine learning of optical properties of materials—predicting spectra from images and images from spectra. Chem. Sci. 10:147–55
    [Google Scholar]
  78. 78. 
    Combs A, Maldonis JJ, Feng J, Xu Z, Voyles PM, Morgan D 2019. Fast approximate STEM image simulations from a machine learning model. Adv. Struct. Chem. Imaging 5:2
    [Google Scholar]
  79. 79. 
    Mikolov T, Chen K, Corrado G, Dean J 2013. Efficient estimation of word representations in vector space. arXiv:1301.3781 [cs.CL]
  80. 80. 
    Pennington J, Socher R, Manning CD 2014. GloVe: global vectors for word representation. Proceedings of the 2014 Conference on Empirical Methods in Natural Language Processing (EMNLP) A Moschitti, B Pang, W Daelemans 1532–43 Doha, Qatar: Assoc. Comput. Ling.
    [Google Scholar]
  81. 81. 
    dos Santos CN, Gatti M, dos Santos CN, Gatti M 2014. Deep convolutional neural networks for sentiment analysis of short texts. Proceedings of COLING 2014, the 25th International Conference on Computational Linguistics: Technical Papers J Tsujii, J Hajic 69–78 Dublin: Assoc. Comput. Ling.
    [Google Scholar]
  82. 82. 
    Westergaard D, Stærfeldt HH, Tønsberg C, Jensen LJ, Brunak S 2018. A comprehensive and quantitative comparison of text-mining in 15 million full-text articles versus their corresponding abstracts. PLOS Comput. Biol. 14:2e1005962
    [Google Scholar]
  83. 83. 
    Rebholz-Schuhmann D, Oellrich A, Hoehndorf R 2012. Text-mining solutions for biomedical research: enabling integrative biology. Nat. Rev. Genet. 13:12829–39
    [Google Scholar]
  84. 84. 
    Yandell MD, Majoros WH. 2002. Genomics and natural language processing. Nat. Rev. Genet. 3:8601–10
    [Google Scholar]
  85. 85. 
    Meystre SM, Savova GK, Kipper-Schuler KC, Hurdle JF 2008. Extracting information from textual documents in the electronic health record: a review of recent research. Yearb. Med. Inform. 17:1128–44
    [Google Scholar]
  86. 86. 
    Evans JA, Aceves P. 2016. Machine translation: mining text for social theory. Annu. Rev. Sociol. 42:21–50
    [Google Scholar]
  87. 87. 
    Tshitoyan V, Dagdelen J, Weston L, Dunn A, Rong Z et al. 2019. Unsupervised word embeddings capture latent knowledge from materials science literature. Nature 571:776395–98
    [Google Scholar]
  88. 88. 
    Tshitoyan V, Dagdelen J, Weston L, Dunn A, Rong Z et al. 2019. Supplementary materials for “Unsupervised word embeddings capture latent knowledge from materials science literature.”. Nature 571:95–98 https://github.com/materialsintelligence/mat2vec
    [Google Scholar]
  89. 89. 
    Kim E, Huang K, Tomala A, Matthews S, Strubell E et al. 2017. Machine-learned and codified synthesis parameters of oxide materials. Sci. Data 4:170127
    [Google Scholar]
  90. 90. 
    Mysore S, Jensen Z, Kim E, Huang K, Chang H-S et al. 2019. The materials science procedural text corpus: annotating materials synthesis procedures with shallow semantic structures. arXiv:1905.06939 [cs.CL]
  91. 91. 
    Vaswani A, Shazeer N, Parmar N, Uszkoreit J, Jones L et al. 2017. Attention is all you need. Advances in Neural Information Processing Systems 30 I Guyon, UV Luxburg, S Bengio, H Wallach, R Fergus et al.5999–6009 Long Beach, CA: Neural Inf. Process. Syst. Found.
    [Google Scholar]
  92. 92. 
    Devlin J, Chang M-W, Lee K, Toutanova K 2019. BERT: pre-training of deep bidirectional transformers for language understanding. Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies J Burstein, C Doran, T Solorio 4171–86 Minneapolis, MN: Assoc. Comput. Ling.
    [Google Scholar]
  93. 93. 
    Strubell E, Verga P, Belanger D, McCallum A 2017. Fast and accurate entity recognition with iterated dilated convolutions. Proceedings of the 2017 Conference on Empirical Methods in Natural Language Processing M Palmer, R Hwa, S Riedel 2670–80 Copenhagen, Den.: Assoc. Comput. Ling.
    [Google Scholar]
  94. 94. 
    Honnibal M, Johnson M. 2015. An improved non-monotonic transition system for dependency parsing. Proceedings of the 2015 Conference on Empirical Methods in Natural Language Processing L Màrquez, C Callison-Burch, J Su 1373–78 Lisbon, Port.: Assoc. Comput. Ling.
    [Google Scholar]
  95. 95. 
    Swain MC, Cole JM. 2016. ChemDataExtractor: a toolkit for automated extraction of chemical information from the scientific literature. J. Chem. Inf. Model. 56:101894–904
    [Google Scholar]
  96. 96. 
    Weston L, Tshitoyan V, Dagdelen J, Kononova O, Persson KA et al. 2019. Named entity recognition and normalization applied to large-scale information extraction from the materials science literature. ChemRxiv. https://doi.org/10.26434/chemrxiv.8226068.v1
    [Crossref] [Google Scholar]
  97. 97. 
    Kim E, Huang K, Saunders A, McCallum A, Ceder G, Olivetti E 2017. Materials synthesis insights from scientific literature via text extraction and machine learning. Chem. Mater. 29:219436–44
    [Google Scholar]
  98. 98. 
    Kim E, Jensen Z, van Grootel A, Huang K, Staib M et al. 2018. Inorganic materials synthesis planning with literature-trained neural networks. arXiv:1901.00032 [cond-mat.mtrl-sci]
  99. 99. 
    Kim E, Huang K, Kononova O, Ceder G, Olivetti E 2019. Distilling a materials synthesis ontology. Matter 1:18–12
    [Google Scholar]
  100. 100. 
    Mysore S, Kim E, Strubell E, Liu A, Chang H-S et al. 2017. Automatically extracting action graphs from materials science synthesis procedures. arXiv:1711.06872 [cs.CL]
  101. 101. 
    Beta Writ 2019. Lithium-Ion Batteries: A Machine-Generated Summary of Current Research Cham, Switz.: Springer Nat. Switz. AG
  102. 102. 
    Botu V, Batra R, Chapman J, Ramprasad R 2017. Machine learning force fields: construction, validation, and outlook. J. Phys. Chem. C 121:1511–22
    [Google Scholar]
  103. 103. 
    Botu V, Ramprasad R. 2015. Adaptive machine learning framework to accelerate ab initio molecular dynamics. Int. J. Quantum Chem. 115:161074–83
    [Google Scholar]
  104. 104. 
    Li Z, Kermode JR, De Vita A 2015. Molecular dynamics with on-the-fly machine learning of quantum-mechanical forces. Phys. Rev. Lett. 114:996405
    [Google Scholar]
  105. 105. 
    Huan TD, Batra R, Chapman J, Krishnan S, Chen L, Ramprasad R 2017. A universal strategy for the creation of machine learning-based atomistic force fields. npj Comput. Mater. 3:37
    [Google Scholar]
  106. 106. 
    Behler J. 2017. First principles neural network potentials for reactive simulations of large molecular and condensed systems. Angew. Chemie Int. Ed. 56:2–15
    [Google Scholar]
  107. 107. 
    Rupp M. 2015. Machine learning for quantum mechanics in a nutshell. Int. J. Quantum Chem. 115:161058–73
    [Google Scholar]
  108. 108. 
    Chan H, Narayanan B, Cherukara MJ, Sen FG, Sasikumar K et al. 2019. Machine learning classical interatomic potentials for molecular dynamics from first-principles training data. J. Phys. Chem. C 123:126941–57
    [Google Scholar]
  109. 109. 
    Bartók AP, De S, Poelking C, Bernstein N, Kermode JR et al. 2017. Machine learning unifies the modeling of materials and molecules. Sci. Adv. 3:12e1701816
    [Google Scholar]
  110. 110. 
    Artrith N, Morawietz T. 2011. High-dimensional neural-network potentials for multicomponent systems: applications to zinc oxide. Phys. Rev. B 83:15153101
    [Google Scholar]
  111. 111. 
    Artrith N, Urban A, Ceder G 2017. Efficient and accurate machine-learning interpolation of atomic energies in compositions with many species. Phys. Rev. B 96:114112
    [Google Scholar]
  112. 112. 
    Nie X, Chien P, Morgan D, Kaczmarowski A 2019. A statistical method for emulation of computer models with invariance-preserving properties, with application to structural energy prediction. J. Am. Stat. Assoc. https://doi.org/10.1080/01621459.2019.1654876
    [Crossref] [Google Scholar]
  113. 113. 
    Behler J. 2016. Perspective: machine learning potentials for atomistic simulations. J. Chem. Phys. 145:17170901
    [Google Scholar]
  114. 114. 
    Ward L, Wolverton C. 2017. Atomistic calculations and materials informatics: a review. Curr. Opin. Solid State Mater. Sci. 21:3167–76
    [Google Scholar]
  115. 115. 
    Behler J, Parrinello M. 2007. Generalized neural-network representation of high-dimensional potential-energy surfaces. Phys. Rev. Lett. 98:14146401
    [Google Scholar]
  116. 116. 
    Peterson AA, Christensen R, Khorshidi A 2017. Addressing uncertainty in atomistic machine learning. Phys. Chem. Chem. Phys. 19:1810978–85
    [Google Scholar]
  117. 117. 
    Maurer RJ, Freysoldt C, Reilly AM, Brandenburg JG, Hofmann OT et al. 2019. Advances in density-functional calculations for materials modeling. Annu. Rev. Mater. Res. 49:1–30
    [Google Scholar]
  118. 118. 
    Nagai R, Akashi R, Sasaki S, Tsuneyuki S 2018. Neural-network Kohn-Sham exchange-correlation potential and its out-of-training transferability. J. Chem. Phys. 148:24241737
    [Google Scholar]
  119. 119. 
    Bogojeski M, Vogt-Maranto L, Tuckerman ME, Müller KR, Burke K 2019. Density functionals with quantum chemical accuracy: from machine learning to molecular dynamics. ChemRxiv. https://doi.org/10.26434/chemrxiv.8079917.v1
    [Crossref] [Google Scholar]
  120. 120. 
    Snyder JC, Rupp M, Hansen K, Müller KR, Burke K 2012. Finding density functionals with machine learning. Phys. Rev. Lett. 108:25253002
    [Google Scholar]
  121. 121. 
    Li L, Snyder JC, Pelaschier IM, Huang J, Niranjan UN et al. 2016. Understanding machine-learned density functionals. Int. J. Quantum Chem. 116:11819–33
    [Google Scholar]
  122. 122. 
    Nelson J, Tiwari R, Sanvito S 2019. Machine learning density functional theory for the Hubbard model. Phys. Rev. B 99:7075132
    [Google Scholar]
  123. 123. 
    Mills K, Spanner M, Tamblyn I 2017. Deep learning and the Schrödinger equation. Phys. Rev. A 96:442113
    [Google Scholar]
  124. 124. 
    Lei X, Medford AJ. 2019. Design and analysis of machine learning exchange-correlation functionals via rotationally invariant convolutional descriptors. Phys. Rev. Mater. 3:663801
    [Google Scholar]
  125. 125. 
    Ryczko K, Strubbe D, Tamblyn I 2018. Deep learning and density functional theory. arXiv:1811.08928 [cond-mat.mtrl-sci]
  126. 126. 
    Kajita S, Ohba N, Jinnouchi R, Asahi R 2017. A universal 3D voxel descriptor for solid-state material informatics with deep convolutional neural networks. Sci. Rep. 7:16911
    [Google Scholar]
  127. 127. 
    Brockherde F, Vogt L, Li L, Tuckerman ME, Burke K, Müller KR 2017. Bypassing the Kohn-Sham equations with machine learning. Nat. Commun. 8:872
    [Google Scholar]
  128. 128. 
    Bogojeski M, Brockherde F, Vogt-Maranto L, Li L, Tuckerman ME et al. 2018. Efficient prediction of 3D electron densities using machine learning. arXiv:1811.06255 [physics.comp-ph]
  129. 129. 
    Sinitskiy AV, Pande VS. 2018. Deep neural network computes electron densities and energies of a large set of organic molecules faster than density functional theory (DFT). arXiv:1809.02723 [physics.chem-ph]
  130. 130. 
    Schmidt J, Marques MRG, Botti S, Marques MAL 2019. Recent advances and applications of machine learning in solid-state materials science. npj Comput. Mater. 5:83
    [Google Scholar]
  131. 131. 
    Ward L, Agrawal A, Choudhary A, Wolverton C 2016. A general-purpose machine learning framework for predicting properties of inorganic materials. npj Comput. Mater. 2:16028
    [Google Scholar]
  132. 132. 
    Kausar S, Falcao AO. 2018. An automated framework for QSAR model building. J. Cheminform. 10:11
    [Google Scholar]
  133. 133. 
    Behler J. 2011. Atom-centered symmetry functions for constructing high-dimensional neural network potentials. J. Chem. Phys. 134:774106
    [Google Scholar]
  134. 134. 
    Barok AP, Kondor R, Csanyi G 2013. On representing chemical environments. Phys. Rev. B 87:16184115
    [Google Scholar]
  135. 135. 
    Schütt KT, Glawe H, Brockherde F, Sanna A, Müller KR, Gross EKU 2014. How to represent crystal structures for machine learning: towards fast prediction of electronic properties. Phys. Rev. B 89:20205118
    [Google Scholar]
  136. 136. 
    Hansen K, Biegler F, Ramakrishnan R, Pronobis W, von Lilienfeld OA et al. 2015. Machine learning predictions of molecular properties: accurate many-body potentials and nonlocality in chemical space. J. Phys. Chem. Lett. 6:122326–31
    [Google Scholar]
  137. 137. 
    Huang B, von Lilienfeld OA 2016. Communication: understanding molecular representations in machine learning: the role of uniqueness and target similarity. J. Chem. Phys. 145:16161102
    [Google Scholar]
  138. 138. 
    Huo H, Rupp M. 2017. Unified representation of molecules and crystals for machine learning. arXiv:1704.06439 [physics.chem-ph]
  139. 139. 
    Ward L, Dunn A, Faghaninia A, Zimmermann NER, Bajaj S et al. 2018. Matminer: an open source toolkit for materials data mining. Comput. Mater. Sci. 152:60–69
    [Google Scholar]
  140. 140. 
    Park CW, Wolverton C. 2019. Developing an improved Crystal Graph Convolutional Neural Network framework for accelerated materials discovery. arXiv:1906.05267 [physics.comp-ph]
  141. 141. 
    Korolev V, Mitrofanov A, Korotcov A, Tkachenko V 2019. Graph convolutional neural networks as “general-purpose” property predictors: the universality and limits of applicability. arXiv:1906.06256 [physics.comp-ph]
  142. 142. 
    Chen C, Ye W, Zuo Y, Zheng C, Ong SP 2019. Graph networks as a universal machine learning framework for molecules and crystals. Chem. Mater. 31:93564–72
    [Google Scholar]
  143. 143. 
    DeepChem 2017. DeepChem https://deepchem.io/
  144. 144. 
    He K, Zhang X, Ren S, Sun J 2016. Deep residual learning for image recognition. 2016 IEEE Conference on Computer Vision and Pattern Recognition (CVPR)770–78 Las Vegas, NV: IEEE
    [Google Scholar]
  145. 145. 
    Ren S, He K, Girshick R, Sun J 2015. Faster R-CNN: towards real-time object detection with region proposal networks. IEEE Trans. Pattern Anal. Mach. Intell. 39:61137–49
    [Google Scholar]
  146. 146. 
    Gilmer J, Schoenholz SS, Riley PF, Vinyals O, Dahl GE 2017. Neural message passing for quantum chemistry. In Proceedings of the 34th International Conference on Machine Learning 70:1263–72 Sydney: JMLR
    [Google Scholar]
  147. 147. 
    Schütt KT, Sauceda HE, Kindermans PJ, Tkatchenko A, Müller KR 2018. SchNet – a deep learning architecture for molecules and materials. J. Chem. Phys. 148:24241722
    [Google Scholar]
  148. 148. 
    Schütt KT, Kessel P, Gastegger M, Nicoli KA, Tkatchenko A, Müller KR 2019. SchNetPack: a deep learning toolbox for atomistic systems. J. Chem. Theory Comput. 15:1448–55
    [Google Scholar]
  149. 149. 
    Xie T, Grossman JC. 2018. Hierarchical visualization of materials space with graph convolutional neural networks. J. Chem. Phys. 149:17174111
    [Google Scholar]
  150. 150. 
    Xie T, Grossman JC. 2018. Crystal graph convolutional neural networks for an accurate and interpretable prediction of material properties. Phys. Rev. Lett. 120:14145301
    [Google Scholar]
  151. 151. 
    Lecun Y, Bengio Y, Hinton G 2015. Deep learning. Nature 521:7553436–44
    [Google Scholar]
  152. 152. 
    Sheridan RP. 2013. Time-split cross-validation as a method for estimating the goodness of prospective prediction. J. Chem. Inf. Model. 53:4783–90
    [Google Scholar]
  153. 153. 
    Cawley GC, Talbot NLC. 2010. On over-fitting in model selection and subsequent selection bias in performance evaluation. J. Mach. Learn. Res. 11:2079–2107
    [Google Scholar]
  154. Schwaighofer A, Schroeter T, Mika S, Blanchard G. 2009. How wrong can we get? A review of machine learning approaches and error bars. Comb. Chem. High Throughput Screen. 12:453–68
  155. Pedregosa F, Varoquaux G, Gramfort A, Michel V, Thirion B et al. 2011. Scikit-learn: machine learning in Python. J. Mach. Learn. Res. 12:2825–30
  156. Jacobs R, Mayeshiba T, Afflerbach B, Miles L, Williams M et al. 2019. The Materials Simulation Toolkit for Machine Learning (MAST-ML): an automated open source toolkit to accelerate data-driven materials research. Comput. Mater. Sci. 176:109544
  157. Molinaro AM, Simon R, Pfeiffer RM. 2005. Prediction error estimation: a comparison of resampling methods. Bioinformatics 21:3301–7
  158. Ren F, Ward L, Williams T, Laws KJ, Wolverton C et al. 2018. Accelerated discovery of metallic glasses through iteration of machine learning and high-throughput experiments. Sci. Adv. 4:eaaq1566
  159. Ling J, Hutchinson M, Antono E, Paradiso S, Meredig B. 2017. High-dimensional materials and process optimization using data-driven experimental design with well-calibrated uncertainty estimates. Integr. Mater. Manuf. Innov. 6:207–17
  160. Cortés-Ciriano I, Bender A. 2019. Deep confidence: a computationally efficient framework for calculating reliable prediction errors for deep neural networks. J. Chem. Inf. Model. 59:1269–81
  161. Gal Y, Ghahramani Z. 2016. Dropout as a Bayesian approximation: representing model uncertainty in deep learning. Proceedings of the 33rd International Conference on Machine Learning 48:1050–59. New York: JMLR
  162. Hill J, Mulholland G, Persson K, Seshadri R, Wolverton C, Meredig B. 2016. Materials science with large-scale data and informatics: unlocking new opportunities. MRS Bull. 41:399–409
  163. Jain A, Persson KA, Ceder G. 2016. Research update: The materials genome initiative: data sharing and the impact of collaborative ab initio databases. APL Mater. 4:053102
  164. Hall E, Stemmer S, Zheng H, Zhu Y. 2014. Future of electron scattering and diffraction: next-generation instrumentation and beyond. Rep., Basic Energy Sci. Workshop Future Electron Scatt. Diffr., US Dep. Energy Off. Sci., Washington, DC
  165. Henry S, Berardinis L. 2015. Materials data analytics: a path-finding workshop: workshop results. Rep., ASM Int., Columbus, OH
  166. Belianinov A, Vasudevan R, Strelcov E, Steed C, Yang SM et al. 2015. Big data and deep data in scanning and electron microscopies: deriving functionality from multidimensional data sets. Adv. Struct. Chem. Imaging 1:6
  167. Agrawal A, Choudhary A. 2016. Perspective: materials informatics and big data: realization of the “fourth paradigm” of science in materials science. APL Mater. 4:053208
  168. Liu Y, Zhao T, Ju W, Shi S. 2017. Materials discovery and design using machine learning. J. Mater. 3:159–77
  169. O'Mara J, Meredig B, Michel K. 2016. Materials data infrastructure: a case study of the Citrination platform to examine data import, storage, and access. JOM 68:2031–34
  170. Raccuglia P, Elbert KC, Adler PDF, Falk C, Wenny MB et al. 2016. Machine-learning-assisted materials discovery using failed experiments. Nature 533:73–76
  171. Holzinger A. 2016. Interactive machine learning for health informatics: When do we need the human-in-the-loop? Brain Inform. 3:119–31
  172. Duros V, Grizou J, Sharma A, Mehr SHM, Bubliauskas A et al. 2019. Intuition-enabled machine learning beats the competition when joint human-robot teams perform inorganic chemical experiments. J. Chem. Inf. Model. 59:2664–71
  173. Gómez-Bombarelli R, Aguilera-Iparraguirre J, Hirzel TD, Duvenaud D, Maclaurin D et al. 2016. Design of efficient molecular organic light-emitting diodes by a high-throughput virtual screening and experimental approach. Nat. Mater. 15:1120–27
  174. Sun X, Krakauer NJ, Politowicz A, Chen W-T, Li Q. 2020. Assessing graph-based deep learning models for predicting flash point. Mol. Inform. https://doi.org/10.1002/minf.201900101
  • Article Type: Review Article