Abstract

Deep neural network models have become ubiquitous in recent years and have been applied to nearly all areas of science, engineering, and industry. These models are particularly useful for data that have strong dependencies in space (e.g., images) and time (e.g., sequences). Indeed, deep models have also been used extensively by the statistical community to model spatial and spatiotemporal data through, for example, multilevel Bayesian hierarchical models and deep Gaussian processes. In this review, we first present an overview of traditional statistical and machine learning perspectives for modeling spatial and spatiotemporal data, and then focus on a variety of hybrid models that have recently been developed for latent process, data, and parameter specifications. These hybrid models integrate statistical modeling ideas with deep neural network models to take advantage of the strengths of each modeling paradigm. We conclude with an overview of computational technologies that have proven useful for these hybrid models and a brief discussion of future research directions.
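To fix ideas, the following is a minimal sketch, not taken from the review itself, of one hybrid construction in the family it surveys: a deep compositional (warped) spatial model, in which spatial coordinates are passed through neural-network-style layers before a stationary covariance is applied to the warped locations, yielding a nonstationary process on the original domain. All layer weights, hyperparameter values, and function names below are illustrative assumptions; only NumPy is assumed.

```python
# Sketch of a deep compositional spatial model: warp coordinates through
# simple nonlinear layers, then apply a stationary covariance in the
# warped space. Weights and hyperparameters are illustrative only.
import numpy as np

rng = np.random.default_rng(0)

def warp_layer(s, W, b):
    """One elementwise nonlinear 'deformation' layer applied to 2-D locations."""
    return np.tanh(s @ W + b)

def matern32(d, sigma2=1.0, ell=0.5):
    """Stationary Matern-3/2 covariance as a function of distance d."""
    a = np.sqrt(3.0) * d / ell
    return sigma2 * (1.0 + a) * np.exp(-a)

# n spatial locations in the unit square
n = 100
S = rng.uniform(size=(n, 2))

# Compose two warping layers: s -> f2(f1(s)); in practice the weights
# would be estimated, not drawn at random as here.
W1, b1 = rng.normal(size=(2, 2)), rng.normal(size=2)
W2, b2 = rng.normal(size=(2, 2)), rng.normal(size=2)
S_warped = warp_layer(warp_layer(S, W1, b1), W2, b2)

# Pairwise distances in the warped space induce a nonstationary
# covariance with respect to the original coordinates.
D = np.linalg.norm(S_warped[:, None, :] - S_warped[None, :, :], axis=-1)
K = matern32(D) + 1e-8 * np.eye(n)  # jitter for numerical stability

# Simulate one realization of the resulting nonstationary Gaussian process.
z = np.linalg.cholesky(K) @ rng.normal(size=n)
print(z[:5])
```

Because the warping layers are deterministic functions of the coordinates, the model remains a Gaussian process, so standard kriging-style prediction applies after the warp; this is one way such hybrids retain statistical uncertainty quantification while borrowing flexibility from deep architectures.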

