Abstract

Political scientists are fielding more and more surveys in the developing world. Yet, most survey research methodology derives from experiences in developed countries. Researchers working in the developing world often confront very different challenges to collecting high-quality data. Census data may be unreliable or outdated, enumerators may shirk, political topics may be sensitive, and respondents may be unaccustomed to and uncomfortable with an interview format. In this article, we review both published methodological research and the best practices of scholars based on an original expert survey of survey researchers. We characterize the state of the field and provide insights about the range of available options for implementing surveys in the developing world. We examine and assess innovations across many aspects of survey implementation, including sampling, enumeration, data collection, ethical considerations, and reporting. We also offer suggestions for future methodological inquiry and for greater research transparency.

DOI: 10.1146/annurev-polisci-052115-021432
Published: 2018-05-11
Article Type: Review Article