Transparency, Replication, and Cumulative Learning: What Experiments Alone Cannot Achieve

Abstract

Replication of simple and transparent experiments should promote the cumulation of knowledge. Yet randomization alone does not guarantee simple analysis, transparent reporting, or third-party replication. This article surveys several challenges to cumulative learning from experiments and discusses emerging research practices—including several kinds of prespecification, two forms of replication, and a new model for coordinated experimental research—that may partially overcome these obstacles. I reflect on both the strengths and limitations of these new approaches to social science research.

An erratum has been published for this article: Transparency, Replication, and Cumulative Learning: What Experiments Alone Cannot Achieve.

