
Abstract

Population health interventions are essential to reduce health inequalities and tackle other public health priorities, but they are not always amenable to experimental manipulation. Natural experiment (NE) approaches are attracting growing interest as a way of providing evidence in such circumstances. One key challenge in evaluating NEs is selective exposure to the intervention. Studies should be based on a clear theoretical understanding of the processes that determine exposure. Even if the observed effects are large and rapidly follow implementation, confidence in attributing these effects to the intervention can be improved by carefully considering alternative explanations. Causal inference can be strengthened by including additional design features alongside the principal method of effect estimation. NE studies often rely on existing (including routinely collected) data. Investment in such data sources and the infrastructure for linking exposure and outcome data is essential if the potential for such studies to inform decision making is to be realized.
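One principal method of effect estimation often used in natural experiment studies is segmented regression of an interrupted time series. As a purely illustrative sketch (not drawn from the article, and using simulated data), the model below regresses a monthly outcome on an intercept, a pre-intervention trend, a post-intervention level change, and a post-intervention slope change; the coefficient on the level-change indicator estimates the immediate effect of the intervention.

```python
import numpy as np

# Illustrative sketch only: segmented regression for an interrupted
# time series. All data are simulated for demonstration purposes.
rng = np.random.default_rng(42)

n, t0 = 48, 24                       # 48 monthly observations; intervention at month 24
t = np.arange(n)
post = (t >= t0).astype(float)       # indicator for post-intervention period

# Simulated outcome: baseline level 100, upward trend, and a level
# drop of 5 units when the intervention begins, plus noise.
y = 100 + 0.5 * t - 5.0 * post + rng.normal(0, 1.0, n)

# Design matrix: intercept, pre-trend, level change, slope change.
X = np.column_stack([np.ones(n), t, post, post * (t - t0)])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)

print(f"estimated level change at intervention: {beta[2]:.2f}")
```

In a real evaluation the fitted coefficients would be accompanied by standard errors (with adjustment for autocorrelation), and, as the abstract notes, confidence in a causal interpretation depends on additional design features such as control series, not on the regression alone.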

doi: 10.1146/annurev-publhealth-031816-044327
Published online: 2017-03-20

  • Article Type: Review Article