Abstract

Public health researchers and practitioners are calling for greater focus on external validity, the ability to generalize findings of evidence-based interventions (EBIs) beyond the limited number of studies testing effectiveness. For public health, the goal is applicability: to translate, disseminate, and implement EBIs for an impact on population health. This article reviews methods and how they might be combined to better assess external validity. The methods include (a) better description of EBIs and their contexts; (b) combined use of statistical tools and logic to draw inferences about study samples; (c) sharper definition of the theory behind the intervention and its core components; and (d) more systematic consultation of practitioners. For population impact, studies should focus on context features that are likely to be both important (based on program theory) and frequently encountered by practitioners. Mixed-method programs of research will allow public health to expand causal generalizations.
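Point (b) above, using statistical tools to draw inferences about how well study samples represent target populations, can be illustrated with a common diagnostic: the standardized mean difference (SMD) between trial sites and the population of interest. The sketch below is illustrative only; the covariate, data values, and the 0.25 rule-of-thumb threshold are assumptions, not taken from this article.

```python
# Sketch: quantify how well a set of trial sites represents a target
# population by computing the standardized mean difference (SMD) on a
# key covariate. Hypothetical data; for illustration only.
from math import sqrt

def smd(sample, population):
    """Standardized mean difference between sample and population values."""
    ms = sum(sample) / len(sample)
    mp = sum(population) / len(population)
    vs = sum((x - ms) ** 2 for x in sample) / (len(sample) - 1)
    vp = sum((x - mp) ** 2 for x in population) / (len(population) - 1)
    pooled_sd = sqrt((vs + vp) / 2)  # pooled standard deviation
    return (ms - mp) / pooled_sd if pooled_sd > 0 else 0.0

# Hypothetical covariate: median community income (in $1,000s)
trial_sites = [42, 45, 48, 51, 44]
target_population = [35, 38, 52, 60, 41, 47, 33, 55]

d = smd(trial_sites, target_population)
# |SMD| > 0.25 is a common rule-of-thumb flag for poor representativeness
print(f"SMD = {d:.2f}; flagged: {abs(d) > 0.25}")
```

In practice, such balance diagnostics are computed across many covariates at once (often via propensity-score methods), and a large imbalance signals that generalizing trial findings to the target population requires caution or reweighting.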

DOI: 10.1146/annurev-publhealth-031816-044509
Published: 2017-03-20

  • Article Type: Review Article