Abstract

State-space models incorporate subject-matter knowledge of the underlying dynamics of a time series through a latent Markov state process. The user specifies the dynamics of this process together with how the state relates to the partial, noisy observations that are available. Inference and prediction then require solving a challenging inverse problem: calculating the conditional distribution of quantities of interest given the observations. This article reviews Monte Carlo algorithms for solving this inverse problem, covering methods based on the particle filter and on the ensemble Kalman filter. We discuss the challenges posed by models with high-dimensional states, joint estimation of parameters and the state, and inference for the history of the state process. We also point out potential new developments that will be important for tackling cutting-edge filtering applications.
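As a concrete illustration of the filtering problem described above, the following is a minimal sketch of a bootstrap particle filter applied to a toy univariate linear-Gaussian state-space model. The model, parameter values, and variable names are illustrative choices of ours, not taken from the article; a linear-Gaussian model is used purely so the example is short and self-contained.

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative univariate linear-Gaussian state-space model (an assumption
# for this sketch, not the article's example):
#   state:       x_t = phi * x_{t-1} + sigma_x * eps_t
#   observation: y_t = x_t + sigma_y * eta_t
T, N = 200, 1000                      # time steps, number of particles
phi, sigma_x, sigma_y = 0.9, 1.0, 0.5

# Simulate synthetic states and observations from the model.
x_true = np.zeros(T)
for t in range(1, T):
    x_true[t] = phi * x_true[t - 1] + sigma_x * rng.normal()
y = x_true + sigma_y * rng.normal(size=T)

# Bootstrap particle filter: propagate particles through the state
# dynamics, weight them by the observation density, then resample.
particles = rng.normal(0.0, sigma_x / np.sqrt(1 - phi**2), size=N)
filter_mean = np.zeros(T)
for t in range(T):
    if t > 0:
        particles = phi * particles + sigma_x * rng.normal(size=N)
    log_w = -0.5 * ((y[t] - particles) / sigma_y) ** 2   # Gaussian log-density, up to a constant
    w = np.exp(log_w - log_w.max())
    w /= w.sum()
    filter_mean[t] = np.sum(w * particles)               # estimate of E[x_t | y_{1:t}]
    particles = particles[rng.choice(N, size=N, p=w)]    # multinomial resampling

print("RMSE of filtering mean vs. true state:",
      np.sqrt(np.mean((filter_mean - x_true) ** 2)))
```

For this toy model the exact filtering distribution is available from the Kalman filter, so the sketch serves only as a sanity check. Particle filters are of interest precisely when the state dynamics or observation model are nonlinear or non-Gaussian, and ensemble Kalman filters replace the weighting-and-resampling step with a Gaussian-motivated update of the particle ensemble.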
