
Abstract

Political scientists use diverse methods to study important topics. The findings they reach and conclusions they draw can have significant social implications and are sometimes controversial. As a result, audiences can be skeptical about the rigor and relevance of the knowledge claims that political scientists produce. For these reasons, being a political scientist means facing myriad questions about how we know what we claim to know. Transparency can help political scientists address these questions. An emerging literature and set of practices suggest that sharing more data and providing more information about our analytic and interpretive choices can help others understand the rigor and relevance of our claims. At the same time, increasing transparency can be costly and has been contentious. This review describes opportunities created by, and difficulties posed by, attempts to increase transparency. We conclude that, despite the challenges, consensus about the value and practice of transparency is emerging within and across political science's diverse and dynamic research communities.

doi: 10.1146/annurev-polisci-091515-025429
Published: 2018-05-11

Article Type: Review Article