
Abstract

High-throughput data generation methods and machine learning (ML) algorithms have given rise to a new era of computational materials science by learning the relations between composition, structure, and properties and by exploiting such relations for design. However, to build these connections, materials data must be translated into a numerical form, called a representation, that can be processed by an ML model. Data sets in materials science vary in format (ranging from images to spectra), size, and fidelity. Predictive models vary in scope and properties of interest. Here, we review context-dependent strategies for constructing representations that enable the use of materials as inputs or outputs for ML models. Furthermore, we discuss how modern ML techniques can learn representations from data and transfer chemical and physical information between tasks. Finally, we outline high-impact questions that have not been fully resolved and thus require further investigation.
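The central notion above — translating a material into a numerical form an ML model can process — can be illustrated with a minimal sketch (not taken from the review; the element vocabulary and function name are illustrative): encoding a chemical composition as a fixed-length vector of element fractions, one of the simplest composition-based representations.

```python
# Minimal illustrative sketch: a composition-based representation.
# A material's composition (element -> count) is mapped to a fixed-length
# vector of element fractions over a chosen element vocabulary, so that
# any composition becomes a numeric input of the same dimensionality.

# Toy element vocabulary (an assumption for this sketch; real featurizers
# span the full periodic table and add elemental property statistics).
ELEMENTS = ["H", "C", "N", "O", "Fe", "Ti"]

def composition_vector(composition):
    """Map a {element: count} dict to normalized fractions over ELEMENTS."""
    total = sum(composition.values())
    return [composition.get(el, 0) / total for el in ELEMENTS]

# TiO2 becomes a 6-dimensional vector with nonzero entries for O and Ti.
vec = composition_vector({"Ti": 1, "O": 2})
```

Richer representations discussed in the review (structure graphs, atom-density descriptors, learned embeddings) extend this same idea: a fixed, model-readable encoding that preserves the chemistry and, where needed, the geometry of the material.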

DOI: 10.1146/annurev-matsci-080921-085947
2023-07-03
2024-06-20

  • Article Type: Review Article