Abstract

The flexible modeling of an entire distribution as a function of covariates, known as distributional regression, has seen growing interest over the past decades in both the statistics and machine learning literature. This review outlines selected state-of-the-art statistical approaches to distributional regression, complemented with alternatives from machine learning. Topics covered include the similarities and differences between these approaches, extensions, properties and limitations, estimation procedures, and the availability of software. In view of the increasing complexity and availability of large-scale data, this review also discusses the scalability of traditional estimation methods, current trends, and open challenges. Illustrations are provided using data on childhood malnutrition in Nigeria and Australian electricity prices.
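To make the core idea concrete, here is a minimal sketch (not taken from the review itself) of the simplest kind of distributional regression: a Gaussian model in which both the mean and the scale depend on a covariate, fitted by maximum likelihood. All data are simulated and all parameter names are illustrative assumptions.

```python
# Minimal distributional regression sketch: y ~ N(mu(x), sigma(x)^2),
# with mu(x) = b0 + b1*x and log sigma(x) = g0 + g1*x (log link keeps
# sigma positive). Fitted by numerically maximizing the log-likelihood.
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

rng = np.random.default_rng(0)
x = rng.uniform(0.0, 1.0, 500)
# True parameters (assumed for this simulation): b0=1, b1=2, g0=-1, g1=1.5
y = 1.0 + 2.0 * x + np.exp(-1.0 + 1.5 * x) * rng.standard_normal(500)

def neg_log_lik(theta):
    b0, b1, g0, g1 = theta
    mu = b0 + b1 * x
    sigma = np.exp(g0 + g1 * x)  # scale varies with the covariate
    return -norm.logpdf(y, loc=mu, scale=sigma).sum()

fit = minimize(neg_log_lik, x0=np.zeros(4), method="BFGS")
b0, b1, g0, g1 = fit.x  # estimates should be close to the true values
```

Unlike mean regression, the fitted model describes how the whole conditional distribution (here, via its scale as well as its location) shifts with the covariate; structured additive distributional regression and GAMLSS generalize this idea to many distributions and semiparametric predictors.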

/content/journals/10.1146/annurev-statistics-040722-053607
2024-04-22
2024-06-16

  • Article Type: Review Article