
Abstract

As part of a broader methodological reform movement, scientists are increasingly interested in improving the replicability of their research. Replicability allows others to repeat studies and thereby expose potential errors and statistical issues that might call the original results into question. Little attention, however, has been paid to the state of replicability in the field of empirical legal research (ELR). Replicability is especially important in this field because courts and other legal bodies regularly rely on the work of empirical legal researchers. In this review, we summarize the current state of ELR relative to the broader movement toward replicability in the social sciences. As part of that aim, we describe recent collective replication efforts in ELR and the transparency and replicability guidelines adopted by journals that publish ELR. Based on this review, ELR appears to lag behind other fields in implementing reforms. We conclude with suggestions for reforms that might encourage improved replicability.

DOI: 10.1146/annurev-lawsocsci-121620-085055
Published: 2021-10-13

Article Type: Review Article