Sequential Monte Carlo methods—also known as particle filters—offer approximate solutions to filtering problems for nonlinear state-space systems. These filtering problems are notoriously difficult to solve in general, owing to the lack of closed-form expressions and to challenging expectation integrals. The essential idea behind particle filters is to employ Monte Carlo integration techniques to mitigate both of these difficulties. This article presents an intuitive introduction to the main particle filter ideas and then unifies three commonly employed particle filtering algorithms. This unified approach relies on a nonstandard presentation of the particle filter, which has the advantage of highlighting precisely where the differences among these algorithms arise. Some relevant extensions and successful application domains of the particle filter are also presented.
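To make the Monte Carlo integration idea concrete, the following is a minimal sketch of a bootstrap particle filter for a hypothetical scalar linear-Gaussian toy model (the model, its coefficients, and all function names here are illustrative assumptions, not the specific algorithms unified in the article): particles are propagated through the state dynamics, weighted by the measurement likelihood, and resampled to counteract weight degeneracy.

```python
import numpy as np

def bootstrap_pf(y, n_particles=500, q=0.1, r=0.5, seed=0):
    """Bootstrap particle filter for the illustrative toy model
        x_t = 0.9 * x_{t-1} + v_t,   v_t ~ N(0, q)
        y_t = x_t + e_t,             e_t ~ N(0, r).
    Returns the filtered mean estimate of x_t at each time step."""
    rng = np.random.default_rng(seed)
    x = rng.normal(0.0, 1.0, n_particles)  # draw particles from the prior
    means = []
    for yt in y:
        # 1. Propagate: sample each particle through the state dynamics
        x = 0.9 * x + rng.normal(0.0, np.sqrt(q), size=n_particles)
        # 2. Weight: evaluate the measurement likelihood p(y_t | x_t),
        #    working with log-weights for numerical stability
        logw = -0.5 * (yt - x) ** 2 / r
        w = np.exp(logw - logw.max())
        w /= w.sum()
        # Weighted particle average approximates the filtering expectation
        means.append(float(w @ x))
        # 3. Resample (multinomial) to avoid weight degeneracy
        x = rng.choice(x, size=n_particles, p=w)
    return means
```

The three algorithms the article unifies differ mainly in how the propagation and weighting steps above are chosen; this sketch uses the simplest choice, where particles are propagated by the state dynamics themselves.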





  • Article Type: Review Article