
Abstract

Thermophysical properties of fluid mixtures are important in many fields of science and engineering. However, experimental data are scarce, so prediction methods are vital. Different types of physical prediction methods are available, ranging from molecular models through equations of state to models of excess properties. These well-established methods are currently being complemented by new methods from the field of machine learning (ML). This review focuses on the rapidly developing interface between these two approaches and gives a structured overview of how physical modeling and ML can be combined to yield hybrid models. We illustrate the different options with examples from recent research and give an outlook on future developments.
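As a concrete illustration of the hybrid models the review surveys, the following minimal sketch shows one common way of combining the two approaches, often called residual or delta learning: a physical baseline supplies the prediction, and an ML model is trained only on the baseline's residuals. The baseline (a linear mixing rule), the synthetic data, and all names here are illustrative assumptions, not the review's own method.

```python
# Minimal sketch of a physical/ML hybrid via residual (delta) learning.
# The "physical" baseline and the synthetic data are illustrative assumptions.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

def physical_baseline(x):
    # Hypothetical physical model: an ideal-mixture (linear mixing rule)
    # estimate of a mixture property from the pure-component values.
    x1, p1, p2 = x[:, 0], x[:, 1], x[:, 2]
    return x1 * p1 + (1.0 - x1) * p2

# Synthetic "experimental" data: baseline plus a systematic excess
# contribution that the ideal-mixture physics cannot capture.
rng = np.random.default_rng(0)
X = np.column_stack([
    rng.uniform(0, 1, 200),   # mole fraction x1
    rng.uniform(1, 2, 200),   # pure-component property of component 1
    rng.uniform(1, 2, 200),   # pure-component property of component 2
])
y = physical_baseline(X) + 0.3 * X[:, 0] * (1 - X[:, 0]) + rng.normal(0, 0.01, 200)

# Train the ML component only on the residual, i.e., the part the physics misses.
residual_model = RandomForestRegressor(n_estimators=200, random_state=0)
residual_model.fit(X, y - physical_baseline(X))

def hybrid_predict(x_new):
    # Hybrid prediction = physical baseline + learned correction.
    return physical_baseline(x_new) + residual_model.predict(x_new)
```

The division of labor in such a scheme lets the physical model carry the qualitative trends while the ML part absorbs the systematic deviations; this is only one of several hybridization options discussed in the review.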

