Abstract

Systematic reviews are characterized by a methodical and replicable methodology and presentation. They involve a comprehensive search to locate all relevant published and unpublished work on a subject; a systematic integration of search results; and a critique of the extent, nature, and quality of evidence in relation to a particular research question. The best reviews synthesize studies to draw broad theoretical conclusions about what a literature means, linking theory to evidence and evidence to theory. This guide describes how to plan, conduct, organize, and present a systematic review of quantitative (meta-analysis) or qualitative (narrative review, meta-synthesis) information. We outline core standards and principles and describe commonly encountered problems. Although this guide targets psychological scientists, its high level of abstraction makes it potentially relevant to any subject area or discipline. We argue that systematic reviews are a key methodology for clarifying whether and how research findings replicate and for explaining possible inconsistencies, and we call for researchers to conduct systematic reviews to help elucidate whether there is a replication crisis.
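The abstract distinguishes quantitative synthesis (meta-analysis) from qualitative synthesis (narrative review, meta-synthesis). As a rough illustration of the quantitative case only, the sketch below shows the core pooling step of an inverse-variance-weighted fixed-effect meta-analysis, with Cochran's Q and I-squared as a heterogeneity check. It is not taken from the article; the effect sizes and standard errors are invented purely for illustration.

import math

# Hypothetical per-study effect sizes (e.g., standardized mean differences)
# and their standard errors; these numbers are invented for illustration.
effects = [0.42, 0.31, 0.55, 0.12]
ses = [0.10, 0.15, 0.20, 0.08]

# Inverse-variance weights: more precise studies count for more.
weights = [1.0 / se ** 2 for se in ses]

# Fixed-effect pooled estimate and its standard error.
pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
se_pooled = math.sqrt(1.0 / sum(weights))

# Cochran's Q and I-squared as a rough check on between-study heterogeneity.
q = sum(w * (e - pooled) ** 2 for w, e in zip(weights, effects))
df = len(effects) - 1
i_squared = max(0.0, (q - df) / q) * 100.0 if q > 0 else 0.0

print(f"Pooled effect: {pooled:.3f}, "
      f"95% CI [{pooled - 1.96 * se_pooled:.3f}, {pooled + 1.96 * se_pooled:.3f}]")
print(f"Heterogeneity: Q = {q:.2f} on {df} df, I^2 = {i_squared:.1f}%")

A random-effects model would extend this by adding an estimated between-study variance to each weight; standard meta-analysis texts cover that extension and when it is preferable.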

