
Abstract

I review selected articles from the survey methodology literature on the consequences of asking sensitive questions in censuses and surveys, using a total survey error (TSE) framework. I begin with definitions of sensitive questions and then examine the impact of including sensitive questions on various sources of survey error: specifically, respondents' willingness to participate in a survey (unit nonresponse), their willingness to respond in subsequent rounds of interviews (wave nonresponse), their likelihood of answering sensitive questions after agreeing to participate in the survey (item nonresponse), and the accuracy of their answers to sensitive questions (measurement error). I also review the simultaneous impact of sensitive questions on multiple sources of error in survey estimates and discuss strategies to mitigate the impact of asking sensitive questions on measurement error. I conclude with a summary and suggestions for future research.


Literature Cited

  1. Acquisti A, John LK, Loewenstein G 2012. The impact of relative standards on the propensity to disclose. J. Mark. Res. 49:160–74
  2. Aguinis H, Pierce CA, Quigley BM 1995. Enhancing the validity of self-reported alcohol and marijuana consumption using a bogus pipeline procedure: a meta-analytic review. Basic Appl. Soc. Psychol. 16:515–27
  3. Andreenkova AV, Javeline D. 2019. Sensitive questions in comparative surveys. Advances in Comparative Survey Methods: Multinational, Multiregional, and Multicultural Contexts (3MC) TP Johnson, B Pennell, IAL Stoop, B Dorer 139–60 New York: Wiley
  4. Beatty P, Herrmann D. 2002. To answer or not to answer: decision processes related to survey item nonresponse. Survey Nonresponse RM Groves, DA Dillman, JL Eltinge, RJA Little 71–85 New York: Wiley
  5. Belli RF, Moore SE, VanHoewyk J 2006. An experimental comparison of question forms used to reduce vote overreporting. Electoral Stud. 25:751–59
  6. Belli RF, Traugott MW, Young M, McGonagle KA 1999. Reducing vote overreporting in surveys: social desirability, memory failure, and source monitoring. Public Opin. Q. 63:90–108
  7. Brown JD, Heggeness ML, Dorinski DM, Warren L, Yi M 2018. Understanding the quality of alternative citizenship data sources for the 2020 Census Tech. Rep., US Census Bur., Washington, DC
  8. Brown JD, Heggeness ML, Dorinski DM, Warren L, Yi M 2019. Predicting the effect of adding a citizenship question to the 2020 Census. Demography 56:1173–94
  9. Couper MP, Singer E, Conrad FG, Groves RM 2008. Risk of disclosure, perceptions of risk, and concerns about privacy and confidentiality as factors in survey participation. J. Off. Stat. 24:255–75
  10. Couper MP, Singer E, Conrad FG, Groves RM 2010. Experimental studies of disclosure risk, disclosure harm, topic sensitivity, and survey participation. J. Off. Stat. 26:287–300
  11. De Jonge CPK. 2015. Who lies about electoral gifts? Experimental evidence from Latin America. Public Opin. Q. 79:710–39
  12. de Leeuw ED. 1992. Data Quality in Mail, Telephone and Face to Face Surveys Amsterdam: T.T. Publikaties
  13. Diop A, Le KT, Traugott M 2015. Third-party presence effect with propensity score matching. J. Surv. Stat. Methodol. 3:193–215
  14. Duff B, Hanmer MJ, Park W, White IK 2007. Good excuses: understanding who votes with an improved turnout question. Public Opin. Q. 71:67–90
  15. Fu H, Darroch JE, Henshaw SK, Kolb E 1998. Measuring the extent of abortion underreporting in the 1995 National Survey of Family Growth. Fam. Plan. Perspect. 30:128–38
  16. Gnambs T, Kaspar K. 2015. Disclosure of sensitive behaviors across self-administered survey modes: a meta-analysis. Behav. Res. Methods 47:1237–59
  17. Groves RM. 1989. Survey Errors and Survey Costs New York: Wiley
  18. Groves RM, Couper MP. 1998. Nonresponse in Household Interview Surveys New York: Wiley
  19. Groves RM, Couper MP, Presser S, Singer E, Tourangeau R et al. 2006. Experiments in producing nonresponse bias. Public Opin. Q. 70:720–36
  20. Groves RM, Presser S, Dipko S 2004. The role of topic interest in survey participation decisions. Public Opin. Q. 68:2–31
  21. Groves RM, Presser S, Tourangeau R, West BT, Couper MP et al. 2012. Support for the survey sponsor and nonresponse bias. Public Opin. Q. 76:512–24
  22. Groves RM, Singer E, Corning A 2000. Leverage-salience theory of survey participation: description and an illustration. Public Opin. Q. 64:299–308
  23. Hanmer MJ, Banks AJ, White IK 2014. Experiments to reduce the over-reporting of voting: a pipeline to the truth. Political Anal. 22:130–41
  24. Herrera AV, Benjet C, Méndez E, Casanova L, Medina-Mora ME 2017. How mental health interviews conducted alone, in the presence of an adult, a child or both affects adolescents' reporting of psychological symptoms and risky behaviors. J. Youth Adolesc. 46:417–28
  25. Holbrook AL, Krosnick JA. 2010. Measuring voter turnout by using the randomized response technique: evidence calling into question the method's validity. Public Opin. Q. 74:328–43
  26. Holbrook AL, Krosnick JA. 2013. A new question sequence to measure voter turnout in telephone surveys: results of an experiment in the 2006 ANES pilot study. Public Opin. Q. 77:106–23
  27. Holtgraves T, Eck J, Lasky B 1997. Face management, question wording, and social desirability. J. Appl. Soc. Psychol. 27:1650–71
  28. Jagannathan R. 2001. Relying on surveys to understand abortion behavior: some cautionary evidence. Am. J. Public Health 91:1825–31
  29. Johnson TP. 2014. Sources of error in substance use prevalence surveys. Int. Sch. Res. Notices 2014:923290
  30. Johnson TP, O'Rourke D, Chavez N, Sudman S, Warnecke R et al. 1997. Social cognition and responses to survey questions among culturally diverse populations. Survey Measurement and Process Quality LE Lyberg, PP Biemer, M Collins, ED de Leeuw, C Dippo, et al. 87–114 New York: Wiley
  31. Johnson TP, van de Vijver FJR 2002. Social desirability in cross-cultural research. Cross-Cultural Survey Methods JA Harkness, FJR van de Vijver, PPH Mohler 193–209 New York: Wiley
  32. Jones E, Forrest JD. 1992. Underreporting of abortion in surveys of U.S. women: 1976 to 1988. Demography 29:113–26
  33. Jones RK, Kost K. 2007. Underreporting of induced and spontaneous abortion in the United States: an analysis of the 2002 National Survey of Family Growth. Stud. Fam. Plan. 38:187–97
  34. Junger M. 1989. Discrepancies between police and self-report data for Dutch racial minorities. Br. J. Criminol. 29:273–84
  35. Kreuter F, Presser S, Tourangeau R 2008. Social desirability bias in CATI, IVR, and web surveys. Public Opin. Q. 72:847–65
  36. Kritzinger S, Schwarzer S, Zeglovits E 2012. Reducing overreporting of voter turnout in seven European countries—results from a survey experiment Paper presented at the Annual Conference of the American Association for Public Opinion Research, Orlando, Florida
  37. Krumpal I. 2013. Determinants of social desirability bias in sensitive surveys: a literature review. Qual. Quant. 47:2025–47
  38. Kuhn PM, Vivyan N. 2018. Reducing turnout misreporting in online surveys. Public Opin. Q. 82:300–21
  39. Lensvelt-Mulders GJLM, Hox JJ, van der Heijden PGM, Maas CJM 2005. Meta-analysis of randomized response research: thirty-five years of validation. Sociol. Methods Res. 33:319–48
  40. Lind LH, Schober MF, Conrad FG, Reichert H 2013. Why do survey respondents disclose more when computers ask the questions? Public Opin. Q. 77:888–935
  41. Lipps O. 2007. Attrition in the Swiss Household Panel. Methoden Daten Anal. 1:45–68
  42. Loosveldt G, Pickery J, Billiet J 2002. Item nonresponse as a predictor of unit nonresponse in a panel survey. J. Off. Stat. 18:545–57
  43. McDonald B, Haardoerfer R, Windle M, Goodman M, Berg C 2017. Implications of attrition in a longitudinal web-based survey: an examination of college students participating in a tobacco use study. JMIR Public Health Surveill. 3(4):e73
  44. McDonald JA, Scott ZA, Hanmer MJ 2017. Using self-prophecy to combat vote overreporting on public opinion surveys. Electoral Stud. 50:137–41
  45. Ong A, Weiss DJ. 2000. The impact of anonymity on responses to sensitive questions. J. Appl. Soc. Psychol. 30:1691–708
  46. Persson M, Solevid M. 2014. Measuring political participation—testing social desirability bias in a web-survey experiment. Int. J. Public Opin. Res. 26:98–112
  47. Peter J, Valkenburg PM. 2011. The impact of "forgiving" introductions on the reporting of sensitive behavior in surveys: the role of social desirability response style and developmental status. Public Opin. Q. 75:779–87
  48. Preisendörfer P, Wolter F. 2014. Who is telling the truth? A validation study on determinants of response behavior in surveys. Public Opin. Q. 78:126–46
  49. Rasinski KA, Visser PS, Zagatsky M, Rickett EM 2005. Using implicit goal priming to improve the quality of self-report data. J. Exp. Soc. Psychol. 41:321–27
  50. Régnier-Loilier A, Guisse N. 2012. Sample attrition and distortion over the waves of the French Generations and Gender Survey Tech. Pap., Inst. Natl. Études Démogr., Aubervilliers, Fr.
  51. Rosenfeld B, Imai K, Shapiro J 2016. An empirical validation study of popular survey methodologies for sensitive questions. Am. J. Political Sci. 60:783–802
  52. Sakshaug JW, Yan T, Tourangeau R 2010. Nonresponse error, measurement error, and mode of data collection: tradeoffs in a multi-mode survey of sensitive and non-sensitive items. Public Opin. Q. 74:907–33
  53. Shoemaker PJ, Eichholz M, Skewes EA 2002. Item nonresponse: distinguishing between don't know and refuse. Int. J. Public Opin. Res. 14:193–201
  54. Tierney KI. 2019. Abortion underreporting in Add Health: findings and implications. Popul. Res. Policy Rev. 38:417–28
  55. Tourangeau R, Groves RM, Redline CD 2010. Sensitive topics and reluctant respondents: demonstrating a link between nonresponse bias and measurement error. Public Opin. Q. 74:413–32
  56. Tourangeau R, Rasinski K, Jobe J, Smith TW, Pratt W 1997a. Sources of error in a survey of sexual behavior. J. Off. Stat. 13:341–65
  57. Tourangeau R, Rips LJ, Rasinski KA 2000. The Psychology of Survey Response Cambridge, UK: Cambridge Univ. Press
  58. Tourangeau R, Smith TW, Rasinski KA 1997b. Motivation to report sensitive behaviors in surveys: evidence from a bogus pipeline experiment. J. Appl. Soc. Psychol. 27:209–22
  59. Tourangeau R, Yan T. 2007. Sensitive questions in surveys. Psychol. Bull. 133:859–83
  60. Udry JR, Gaughan M, Schwingl PJ, van den Berg BJ 1996. A medical record linkage analysis of abortion underreporting. Fam. Plan. Perspect. 28:228–31
  61. Uhrig N. 2008. The nature and causes of attrition in the British Household Panel Survey ISER Work. Pap. Ser. 2008-05, Inst. Soc. Econ. Res., Univ. Essex, UK
  62. van der Heijden PGM, van Gils G, Bouts J, Hox JJ 2000. A comparison of randomized response, computer-assisted self-interview, and face-to-face direct questioning: eliciting sensitive information in the context of welfare and unemployment benefit. Sociol. Methods Res. 28:505–37
  63. Waismel-Manor I, Sarid J. 2011. Can overreporting in surveys be reduced? Evidence from Israel's municipal elections. Int. J. Public Opin. Res. 23:522–29
  64. Warren JR, Halpern-Manners A. 2012. Panel conditioning in longitudinal social science surveys. Sociol. Methods Res. 41:491–534
  65. Watson N, Wooden M. 2009. Identifying factors affecting longitudinal survey response. Methodology of Longitudinal Surveys P Lynn 157–81 New York: Wiley
  66. Wyner GA. 1980. Response errors in self-reported number of arrests. Sociol. Methods Res. 9:161–77
  67. Yan T, Cantor D. 2019. Asking survey questions about criminal justice involvement. Public Health Rep. 134:46S–56S
  68. Yan T, Curtin R. 2010. The relation between unit nonresponse and item nonresponse: a response continuum perspective. Int. J. Public Opin. Res. 22:535–51
  69. Yan T, Curtin R, Jans M 2010. Trends in income nonresponse over two decades. J. Off. Stat. 26:145–64
  70. Yang M, Yu Y. 2011. Effects of identifiers in mail surveys. Field Methods 23:243–65
  71. Zeglovits E, Kritzinger S. 2014. New attempts to reduce overreporting of voter turnout and their effects. Int. J. Public Opin. Res. 26:224–34

  • Article Type: Review Article