In this paper, I review the main results on the asymptotic properties of the posterior distribution in nonparametric or high-dimensional models. In particular, I explain how posterior concentration rates can be derived and what we learn from such analysis in terms of the impact of the prior distribution on high-dimensional models. These results concern fully Bayes and empirical Bayes procedures. I also describe some of the results that have been obtained recently in semiparametric models, focusing mainly on the Bernstein–von Mises property. Although these results are theoretical in nature, they shed light on some subtle behaviors of the prior models and sharpen our understanding of the family of functionals that can be well estimated for a given prior model.
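For concreteness, the two central notions of the abstract can be formalized as follows. This is the standard textbook formulation, not notation taken from the article itself: here \(\theta_0\) denotes the true parameter, \(d\) a metric on the parameter space, \(\Pi(\cdot \mid X^{(n)})\) the posterior given \(n\) observations, and \(\psi\) a smooth real-valued functional.

```latex
% Posterior concentration at rate \epsilon_n \to 0: for some constant M > 0,
% the posterior mass outside a shrinking d-ball around the truth vanishes,
\Pi\bigl( \theta : d(\theta, \theta_0) > M \epsilon_n \mid X^{(n)} \bigr)
  \;\longrightarrow\; 0
  \quad \text{in } P_{\theta_0}\text{-probability, as } n \to \infty.

% Bernstein--von Mises property for a smooth functional \psi(\theta):
% the marginal posterior of \psi is asymptotically Gaussian, centered at an
% efficient estimator \hat{\psi}_n, with variance the efficiency bound V_0,
\sqrt{n}\,\bigl( \psi(\theta) - \hat{\psi}_n \bigr) \,\Big|\, X^{(n)}
  \;\rightsquigarrow\; \mathcal{N}(0, V_0)
  \quad \text{in } P_{\theta_0}\text{-probability.}
```

The first display is what is meant by a posterior concentration (or contraction) rate \(\epsilon_n\); the second is the semiparametric Bernstein–von Mises property, which in particular justifies frequentist interpretation of credible intervals for \(\psi(\theta)\).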

Article Type: Review Article