
Abstract

This review focuses on recent methodological and technological developments in survey data collection. Surveys are facing unprecedented challenges from both societal and technological changes. Against this backdrop, I review the survey profession's response to these challenges and developments to enhance and extend the survey tool. I discuss the decline in random digit dialing and the rise of address-based sampling, along with the corresponding shift from telephone surveys to self-administered (mail and/or Web) modes. I discuss the rise in nonprobability sampling approaches, especially those associated with online data collection. I also review so-called big data alternatives to surveys. Finally, I discuss a number of recent methodological and technological trends designed to modernize the survey method. I conclude that although they face a number of major challenges, surveys remain a robust and flexible method for collecting data on, and making inference to, populations.

Published: 2017-07-31

Article Type: Review Article