Acta Informatica Pragensia X:X | DOI: 10.18267/j.aip.279226

Data Quality in Estimates from Probability-Based Online Panels: Systematic Review and Meta-Analysis

Andrea Ivanovska 1, Michael Bosnjak 1,2, Vasja Vehovar 1
1 Faculty of Social Sciences, University of Ljubljana, Ljubljana, Slovenia
2 Department of Psychological Research Methods, Trier University, Trier, Germany

Background: Surveys of the general population increasingly rely on nonprobability samples from access panels rather than probability-based methods, which often yields lower-quality estimates. In response, many official and academic surveys have adopted probability-based online panels (PBOPs), which recruit participants through probability sampling and retain them for follow-up surveys. While these panels reduce costs compared to one-time surveys, they still face low response rates and other challenges that may affect data quality.

Objective: This study aimed to assess the accuracy of PBOPs by synthesising evidence on relative bias (RB), and to examine how RB varies by country, domain, measurement level, and item sensitivity.

Methods: A systematic review yielded 44 eligible studies from 12 countries and 1,897 effect sizes of absolute RB from studies that compared PBOP estimates to benchmarks. A three-level random-effects meta-analytic model accounted for sampling variance as well as within-study and between-study variance. Moderator analyses evaluated the influence of country, item topic, measurement level and sensitivity on RB. Sensitivity analyses excluded the top 5% of RB outliers to test robustness.
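To illustrate the effect-size metric, absolute relative bias expresses how far a panel estimate deviates from an external benchmark, as a percentage of that benchmark. The sketch below is a minimal illustration; the figures in the example are hypothetical, not drawn from the study:

```python
def relative_bias(estimate: float, benchmark: float) -> float:
    """Absolute relative bias (RB) of a survey estimate, in percent.

    RB = |estimate - benchmark| / |benchmark| * 100
    """
    if benchmark == 0:
        raise ValueError("benchmark must be nonzero")
    return abs(estimate - benchmark) / abs(benchmark) * 100.0

# Hypothetical example: a panel estimates a prevalence of 18%
# against an official benchmark of 15%.
rb = relative_bias(18.0, 15.0)
print(round(rb, 2))  # -> 20.0
```

In a meta-analysis such item-level RB values become the effect sizes that are then pooled across items and studies.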

Results: The pooled RB was 23.14% (95% CI: 18.38%–27.91%) and highly heterogeneous. Most variance was attributed to within-study item-level differences. Country and topic did not significantly moderate RB. Items with high topic sensitivity showed significantly higher RB (+19.33%) than items with no sensitivity, and ordinal items showed significantly lower RB than nominal items (−14.90%). However, when sensitivity and measurement level were modelled together, substantial residual heterogeneity remained.
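The pooled estimate above comes from a three-level random-effects model. As a simplified, self-contained illustration of the underlying idea, the sketch below pools effect sizes with a classical two-level DerSimonian-Laird random-effects model; this is a deliberate simplification (the study's model additionally separates within-study from between-study variance), and the input numbers are hypothetical:

```python
import math

def dersimonian_laird(effects, variances):
    """Two-level random-effects pooling (DerSimonian-Laird estimator).

    A simplification of the study's three-level model: here all
    heterogeneity is captured in a single between-effect variance tau2.
    """
    k = len(effects)
    w = [1.0 / v for v in variances]                      # fixed-effect weights
    sw = sum(w)
    y_fixed = sum(wi * yi for wi, yi in zip(w, effects)) / sw
    q = sum(wi * (yi - y_fixed) ** 2 for wi, yi in zip(w, effects))
    c = sw - sum(wi ** 2 for wi in w) / sw
    tau2 = max(0.0, (q - (k - 1)) / c)                    # heterogeneity variance
    w_re = [1.0 / (v + tau2) for v in variances]          # random-effects weights
    pooled = sum(wi * yi for wi, yi in zip(w_re, effects)) / sum(w_re)
    se = math.sqrt(1.0 / sum(w_re))
    return pooled, se, tau2

# Hypothetical study-level RB values (%) and their sampling variances.
pooled, se, tau2 = dersimonian_laird([10.0, 25.0, 30.0], [4.0, 9.0, 16.0])
ci = (pooled - 1.96 * se, pooled + 1.96 * se)             # 95% Wald interval
```

In practice such models are fitted with dedicated software (e.g. multilevel extensions that nest items within studies), which is what allows variance to be decomposed across the three levels reported above.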

Conclusion: While PBOPs offer cost and logistical advantages, careful design is required to mitigate substantial bias, particularly with respect to item sensitivity and measurement scale. PBOPs may be unsuitable for certain question types, such as sensitive or low-prevalence behaviours, especially when high accuracy is needed. Better methodological planning and further innovation are needed to improve PBOP data quality.

Keywords: Online surveys; Probability-based online panels; Data quality; Relative bias; Meta-analysis.

Received: June 4, 2025; Revised: August 14, 2025; Accepted: August 31, 2025; Prepublished online: September 11, 2025 

References

  1. Arcos, A., Rueda, M. d. M., & Pasadas-del-Amo, S. (2020). Treating nonresponse in probability-based online panels through calibration: Empirical evidence from a survey of political decision-making procedures. Mathematics, 8(3), 423. https://doi.org/10.3390/math8030423 Go to original source...
  2. Baker, R., Blumberg, S. J., Brick, J. M., Couper, M. P., Courtright, M., Dennis, J. M., Dillman, D. A., Frankel, M. R., Garland, P., Groves, R. M., Kennedy, C., Krosnick, J. A., Lavrakas, P. J., Lee, S., Link, M. W., Piekarski, L. B., Rao, K. N., Thomas, R. K., & Zahs, D. A. (2010). AAPOR report on online panels. Public Opinion Quarterly, 74, 711-781. https://doi.org/10.1093/poq/nfq048 Go to original source...
  3. Bell, J., Huber, J., & Viscusi, W. K. (2011). Survey mode effects on valuation of environmental goods. International Journal of Environmental Research and Public Health, 8(4), 1222-1243. https://doi.org/10.3390/ijerph8041222 Go to original source...
  4. Berrens, R. P., Bohara, A. K., Jenkins-Smith, H. C., Silva, C. L., & Weimer, D. L. (2003). The advent of internet surveys for political research: A comparison of telephone and internet samples. Political Analysis, 11, 1-22. https://doi.org/10.1093/pan/11.1.1 Go to original source...
  5. Berzelak, J., & Vehovar, V. (2009). Information technology in survey research. In M. Khosrow-Pour (Ed.), Encyclopedia of Information Science and Technology (pp. 2024-2029). IGI Global. https://doi.org/10.4018/978-1-60566-026-4.ch318 Go to original source...
  6. Berzelak, N., & Vehovar, V. (2018). Mode effects on socially desirable responding in web surveys compared to face-to-face and telephone surveys. Advances in Methodology and Statistics, 15(2), 21-43. https://doi.org/10.51936/lrkv4884 Go to original source...
  7. Berzelak, N., Vehovar, V., & Manfreda, K.L. (2025). Nonresponse in Web Surveys. In Lovric, M. (ed) International Encyclopedia of Statistical Science. Springer. https://doi.org/10.1007/978-3-662-69359-9_429 Go to original source...
  8. Bialik, K. (2018, December 6). How asking about your sleep, smoking or yoga habits can help pollsters verify their findings. Pew Research Center. https://www.pewresearch.org/short-reads/2018/12/06/how-asking-about-your-sleep-smoking-or-yoga-habits-can-help-pollsters-verify-their-findings/
  9. Bilgen, I., Dennis, M. J., & Liebert, L. (2018). Nonresponse follow-up impact on AmeriSpeak panel sample composition and representativeness. NORC. https://amerispeak.norc.org/content/dam/amerispeak/research/pdf/Bilgen_etal_WhitePaper1_NRFU_SampleComposition.pdf
  10. Blom, A. G., Bosnjak, M., Cornilleau, A., Cousteaux, A.-S., Das, M., Douhou, S., & Krieger, U. (2016). A comparison of four probability-based online and mixed-mode panels in Europe. Social Science Computer Review, 34(1), 8-25. https://doi.org/10.1177/0894439315574825 Go to original source...
  11. Blom, A. G., Gathmann, C., & Krieger, U. (2015). Setting up an online panel representative of the general population: The German Internet Panel. Field Methods, 27(4), 391-408. https://doi.org/10.1177/1525822X15574494 Go to original source...
  12. Blom, A. G., Herzing, J. M. E., Cornesse, C., Sakshaug, J. W., Krieger, U., & Bossert, D. (2017). Does the recruitment of offline households increase the sample representativeness of probability-based online panels? Evidence from the German Internet Panel. Social Science Computer Review, 35(4), 498-520. https://doi.org/10.1177/0894439316651584 Go to original source...
  13. Boland, M., Sweeney, M. R., Scallan, E., Harrington, M., & Staines, A. (2006). Emerging advantages and drawbacks of telephone surveying in public health research in Ireland and the U.K. BMC Public Health, 6, Article 208. https://doi.org/10.1186/1471-2458-6-208 Go to original source...
  14. Bosch, O. J., & Maslovskaya, O. (2023). GenPopWeb2: The utility of probability-based online surveys - Literature review. NCRM. https://www.ncrm.ac.uk/documents/GenPopWeb2_The%20utility%20of%20probability-based%20online%20surveys_Literature%20review.pdf
  15. Bosnjak, M., Das, M., & Lynn, P. (2016). Methods for Probability-Based Online and Mixed-Mode Panels: Selected Recent Trends and Future Perspectives. Social Science Computer Review, 34(1), 3-7. https://doi.org/10.1177/0894439315579246 Go to original source...
  16. Bottoni, G., & Fitzgerald, R. (2021). Establishing a baseline: Bringing innovation to the evaluation of cross-national probability-based online panels. Survey Research Methods, 15(2), 115-133. https://doi.org/10.18148/srm/2021.v15i2.7457 Go to original source...
  17. Bradley, V. C., Kuriwaki, S., Isakov, M., Sejdinovic, D., Meng, X. L., & Flaxman, S. (2021). Unrepresentative big surveys significantly overestimated US vaccine uptake. Nature, 600(7890), 695-700. https://doi.org/10.1038/s41586-021-04198-4 Go to original source...
  18. Bryda, G., & Costa, A. P. (2023). Qualitative research in digital era: Innovations, methodologies and collaborations. Social Sciences, 12(10), Article 570. https://doi.org/10.3390/socsci12100570 Go to original source...
  19. Büchi, M., Just, N., & Latzer, M. (2016). Modeling the second-level digital divide: A five-country study of social differences in Internet use. New Media & Society, 18(11), 2703-2722. https://doi.org/10.1177/1461444815604154 Go to original source...
  20. Callegaro, M., Baker, R., Bethlehem, J., Göritz, A. S., Krosnick, J. A., & Lavrakas, P. J. (2014). Online panel research. In M. Callegaro, R. Baker, J. Bethlehem, A. S. Göritz, J. A. Krosnick, & P. J. Lavrakas (Eds.), Online panel research: A data quality perspective (pp. 1-22). John Wiley & Sons. https://doi.org/10.1002/9781118763520.ch1 Go to original source...
  21. Callegaro, M., Lozar Manfreda, K., & Vehovar, V. (2015a). Pre-fielding. In Web survey methodology (pp. 35-164). SAGE Publications. Go to original source...
  22. Callegaro, M., Lozar Manfreda, K., & Vehovar, V. (2015b). Selected topics in web survey implementation. In Web survey methodology (pp. 191-230). SAGE Publications. Go to original source...
  23. Callegaro, M., Lozar Manfreda, K., & Vehovar, V. (2015c). Survey research and web surveys. In Web survey methodology (pp. 3-34). SAGE Publications. Go to original source...
  24. Centralna tehni¹ka knji¾nica Univerze v Ljubljani. (2025). DiKUL - Katalog informacijskih virov. https://viri.ctk.uni-lj.si/
  25. Chang, L., & Krosnick, J. A. (2009). National surveys via RDD telephone interviewing versus the Internet: Comparing sample representativeness and response quality. Public Opinion Quarterly, 73(4), 641-678. https://doi.org/10.1093/poq/nfp075 Go to original source...
  26. Cho, S. K., LoCascio, S. P., Lee, K.-O., Jang, D.-H., & Lee, J. M. (2017). Testing the representativeness of a multimode survey in South Korea: Results from KAMOS. Asian Journal for Public Opinion Research, 4(2), 73-87. https://doi.org/10.15206/ajpor.2017.4.2.73 Go to original source...
  27. Cochran, W. G. (1954). The combination of estimates from different experiments. Biometrics, 10(1), 101-129. https://doi.org/10.2307/3001666 Go to original source...
  28. Cornesse, C., & Blom, A. G. (2023). Response quality in nonprobability and probability-based online panels. Sociological Methods & Research, 52(2), 879-908. https://doi.org/10.1177/0049124120914940 Go to original source...
  29. Cornesse, C., & Schaurer, I. (2021). The long-term impact of different offline population inclusion strategies in probability-based online panels: Evidence from the German Internet Panel and the GESIS Panel. Social Science Computer Review, 39(4), 687-704. https://doi.org/10.1177/0894439320984131 Go to original source...
  30. Cornesse, C., Felderer, B., Fikel, M., Krieger, U., & Blom, A. G. (2022a). Recruiting a probability-based online panel via postal mail: Experimental evidence. Social Science Computer Review, 40(5), 1259-1284. https://doi.org/10.1177/08944393211006059 Go to original source...
  31. Cornesse, C., Krieger, U., Sohnius, M.-L., Fikel, M., Friedel, S., Rettig, T., Wenz, A., Juhl, S., Lehrer, R., Möhring, K., Naumann, E., Reifenscheid, M., & Blom, A. G. (2022b). From German Internet Panel to Mannheim Corona Study: Adaptable probability-based online panel infrastructures during the pandemic. Journal of the Royal Statistical Society: Series A (Statistics in Society), 185(3), 773-797. https://doi.org/10.1111/rssa.12749 Go to original source...
  32. de Leeuw, E. D. (2008). Choosing the method of data collection. In E. D. de Leeuw, J. J. Hox, & D. A. Dillman (Eds.), International handbook of survey methodology (pp. 113-135). Lawrence Erlbaum Associates.
  33. de Leeuw, E. D., & Nicholls, W. L., II. (1996). Technological innovations in data collection: Acceptance, data quality and costs. Sociological Research Online, 1(4), 23-37. https://doi.org/10.5153/sro.50 Go to original source...
  34. Daikeler, J., Bo¹njak, M., & Lozar Manfreda, K. (2020). Web versus other survey modes: An updated and extended meta-analysis comparing response rates. Journal of Survey Statistics and Methodology, 8(3), 513-539. https://doi.org/10.1093/jssam/smz008 Go to original source...
  35. Dever, J. A., Amaya, A., Srivastav, A., Roy, K., & Singleton, J. A. (2021). Fit for purpose in action: Design, implementation, and evaluation of the National Internet Flu Survey. Journal of Survey Statistics and Methodology, 9(3). https://doi.org/10.1093/jssam/smaa005 Go to original source...
  36. Dickie, M., Gerking, S., & Goffe, W. L. (2007). Valuation of non-market goods using computer-assisted surveys: A comparison of data quality from internet and RDD sample. https://cook.rfe.org/Survey_Comparison_3.pdf
  37. Dillman, D. A. (2007). Introduction to tailored design. In Mail and internet surveys: The tailored design method (pp. 3-31). John Wiley & Sons.
  38. DiSogra, C., & Callegaro, M. (2015). Metrics and design tool for building and evaluating probability-based online panels. Social Science Computer Review, 34(1), 26-40. https://doi.org/10.1177/0894439315573925 Go to original source...
  39. Eckman, S. (2015). Does the inclusion of non-Internet households in a web panel reduce coverage bias? Social Science Computer Review, 34(1), 41-58. https://doi.org/10.1177/0894439315572985 Go to original source...
  40. Eckman, S., Unangst, J., Dever, J. A., & Antoun, C. (2023). The precision of estimates of nonresponse bias in means. Journal of Survey Statistics and Methodology, 11(4), 758-783. https://doi.org/10.1093/jssam/smac019 Go to original source...
  41. Felderer, B., Repke, L., Weber, W., Schweisthal, J., & Bothmann, L. (2024). Predicting the validity and reliability of survey questions. OSF Preprints. https://doi.org/10.31219/osf.io/hkngd Go to original source...
  42. Gaia, A., Sala, E., & Respi, C. (2025). Internet Coverage Bias in Web Surveys in Europe. Survey Research Methods, 19(2), 153-174. https://doi.org/10.18148/srm/2025.v19i2.8298 Go to original source...
  43. Goodman, A., Brown, M., Silverwood, R. J., Sakshaug, J. W., Calderwood, L., Williams, J., & Ploubidis, G. B. (2022). The impact of using the web in a mixed-mode follow-up of a longitudinal birth cohort study: Evidence from the National Child Development Study. Journal of the Royal Statistical Society: Series A (Statistics in Society), 185(3), 822-850. https://doi.org/10.1111/rssa.12786 Go to original source...
  44. Grönlund, K., & Strandberg, K. (2014). Online panels and validity: Representativeness and attrition in the Finnish eOpinion panel. In M. Callegaro, R. Baker, J. Bethlehem, A. S. Göritz, J. A. Krosnick, & P. J. Lavrakas (Eds.), Online panel research: A data quality perspective (pp. 86-103). John Wiley & Sons. Go to original source...
  45. Groves, R. M., & Peytcheva, E. (2008). The impact of nonresponse rates on nonresponse bias: A meta-analysis. Public Opinion Quarterly, 72(2), 167-189. https://doi.org/10.1093/poq/nfn011 Go to original source...
  46. Groves, R. M., Fowler, F. J., Couper, M. P., Lepkowski, J. M., Singer, E., & Tourangeau, R. (2009). An introduction to survey methodology (Chapter 1). In Survey methodology (pp. 1-38). Wiley.
  47. Harrer, M., Cuijpers, P., Furukawa, T., & Ebert, D. (2021). Multilevel meta-analysis. In Doing meta-analysis with R: A hands-on guide (pp. 287-302). Chapman and Hall/CRC. https://doi.org/10.1201/9781003107347 Go to original source...
  48. Hemsworth, L. M., Rice, M., Hemsworth, P. H., & Coleman, G. J. (2021). Telephone survey versus panel survey samples assessing knowledge, attitudes and behavior regarding animal welfare in the red meat industry in Australia. Frontiers in Psychology, 12, Article 653620. https://doi.org/10.3389/fpsyg.2021.653620 Go to original source...
  49. Herman, P. M., Slaughter, M. E., Qureshi, N., & Lang, D. L. (2024). Comparing health survey data cost and quality between Amazon's Mechanical Turk and Ipsos' KnowledgePanel: An observational study. Journal of Medical Internet Research, 26(4), Article e56794. https://doi.org/10.2196/56794 Go to original source...
  50. Hernandez, K., & Faith, B. (2023). Online but still falling behind: Measuring barriers to internet use 'after access'. Internet Policy Review, 12(2), Article 1713. https://doi.org/10.14763/2023.2.1713 Go to original source...
  51. Higgins, J. P. T., & Thompson, S. G. (2002). Quantifying heterogeneity in a meta-analysis. Statistics in Medicine, 21(11), 1539-1558. https://doi.org/10.1002/sim.1186 Go to original source...
  52. Høgestøl, A., & Skjervheim, Ø. (2014). Norwegian citizen panel, 2013, first wave. Methodology report. https://www.uib.no/sites/w3.uib.no/files/attachments/ncp-methodology-report-wave-1.pdf
  53. Huggins, V., & Eyerman, J. (2001). Probability based Internet surveys: A synopsis of early methods and survey research results. In Research Conference for the Federal Committee on Statistical Methodology, Arlington, VA. https://nces.ed.gov/FCSM/pdf/2001FCSM_Huggins.pdf
  54. Huijsmans, T., Harteveld, E., van der Brug, W., & Lancee, B. (2021). Are cities ever more cosmopolitan? Studying trends in urban-rural divergence of cultural attitudes. Political Geography, 86, 102353. https://doi.org/10.1016/j.polgeo.2021.102353 Go to original source...
  55. Idika, D. O., Owan, V. J., & Agama, V. U. (2023). The application of the nominal scale of measurement in research data analysis. Prestige Journal of Education, 6(1), 190-197.
  56. Ji, L. J., Peng, K., & Nisbett, R. E. (2000). Culture, control, and perception of relationships in the environment. Journal of Personality and Social Psychology, 78(5), 943-955. https://doi.org/10.1037//0022-3514.78.5.943 Go to original source...
  57. Kaczmirek, L., Phillips, B., Pennay, D. W., Lavrakas, P. J., & Neiger, D. (2019). Building a probability-based online panel: Life in Australia (CSRM Methods Series, No. 2/2019). Centre for Social Research and Methods, Australian National University. https://csrm.cass.anu.edu.au/research/publications/building-probability-based-online-panel-life-australia
  58. Kaufman, D. J., Baker, R., Milner, L. C., Devaney, S., & Hudson, K. L. (2016). A survey of U.S. adults' opinions about conduct of a nationwide Precision Medicine Initiative® cohort study of genes and environment. PLOS ONE, 11(8), e0160461. https://doi.org/10.1371/journal.pone.0160461 Go to original source...
  59. Kennedy, C., Mercer, A., Keeter, S., Hatley, N., McGeeney, K., & Gimenez, A. (2016). Evaluating online nonprobability surveys: Vendor choice matters; widespread errors found for estimates based on Blacks and Hispanics. Pew Research Center. https://www.pewresearch.org/methods/2016/05/02/evaluating-online-nonprobability-surveys/
  60. Kocar, S., & Baffour, B. (2023). Comparing and improving the accuracy of nonprobability samples: Profiling Australian surveys. Methods, Data, Analyses, 17(2). https://doi.org/10.12758/MDA.2023.04 Go to original source...
  61. Kocar, S., & Biddle, N. (2023). Do we have to mix modes in probability-based online panel research to obtain more accurate results? Methods, Data, Analyses, 17(1), Article 11. https://doi.org/10.12758/mda.2022.11 Go to original source...
  62. Kocar, S., & Kaczmirek, L. (2023). A meta-analysis of worldwide recruitment rates in 23 probability-based online panels, between 2007 and 2019. International Journal of Social Research Methodology, 27(5), 589-604. https://doi.org/10.1080/13645579.2023.2242202 Go to original source...
  63. Kordzadeh, N., & Ghasemaghaei, M. (2021). Algorithmic bias: review, synthesis, and future research directions. European Journal of Information Systems, 31(3), 388-409. https://doi.org/10.1080/0960085X.2021.1927212 Go to original source...
  64. Lalla, M. (2017). Fundamental characteristics and statistical analysis of ordinal variables: A review. Quality & Quantity, 51(1), 435-458. https://doi.org/10.1007/s11135-016-0314-5 Go to original source...
  65. Lavrakas, P. J., Pennay, D., Neiger, D., & Phillips, B. (2022). Comparing probability-based surveys and nonprobability online panel surveys in Australia: A total survey error perspective. Survey Research Methods, 16(2), 241-266. https://doi.org/10.18148/srm/2022.v16i2.7907 Go to original source...
  66. Lee, S. (2006). An evaluation of nonresponse and coverage errors in a prerecruited probability web panel survey. Social Science Computer Review, 24(4), 460-475. https://doi.org/10.1177/0894439306288085 Go to original source...
  67. Leenheer, J., & Scherpenzeel, A. C. (2013). Does it pay off to include non-internet households in an internet panel? International Journal of Internet Science, 8(1), 17-29.
  68. Liddell, T. M., & Kruschke, J. K. (2018). Analyzing ordinal data with metric models: What could possibly go wrong? Journal of Experimental Social Psychology, 79, 328-348. https://doi.org/10.1016/j.jesp.2018.08.009 Go to original source...
  69. Liu, S. T., Loomis, B. R., Kinsey, S. H., & Taylor, N. C. (2022). Development of a panel of US adult tobacco users to inform tobacco regulatory science. Preventive Medicine Reports, 28, 101827. https://doi.org/10.1016/j.pmedr.2022.101827 Go to original source...
  70. Lorenz, M., & Koneèný, M. (2023). Digital archives as research infrastructure of the future. Acta Informatica Pragensia, 12(2), 327-341. https://doi.org/10.18267/j.aip.219 Go to original source...
  71. Lozar Manfreda, K., Bosnjak, M., Berzelak, J., Haas, I., & Vehovar, V. (2008). Web Surveys versus other Survey Modes: A Meta-Analysis Comparing Response Rates. International Journal of Market Research, 50(1), 79-104. https://doi.org/10.1177/147078530805000107 Go to original source...
  72. Lozar Manfreda, K., Couper, M., Vohar, M., Rivas, S., & Vehovar, V. (2002). Virtual selves and web surveys. In A. Ferligoj & A. Mrvar (Eds.), Developments in social science methodology (pp. 187-213). University of Ljubljana, FDV.
  73. Lugtig, P., Das, M., & Scherpenzeel, A. (2014). Nonresponse and attrition in a probability-based online panel for the general population. In M. Callegaro, R. Baker, J. Bethlehem, A. S. Göritz, J. A. Krosnick, & P. J. Lavrakas (Eds.), Online panel research: A data quality perspective (pp. 135-153). John Wiley & Sons. https://doi.org/10.1002/9781118763520.ch6 Go to original source...
  74. MacInnis, B., Krosnick, J. A., Ho, A. S., & Cho, M.-J. (2018). The accuracy of measurements with probability and nonprobability survey samples: Replication and extension. Public Opinion Quarterly, 82(4), 707-744. https://doi.org/10.1093/poq/nfy038 Go to original source...
  75. Maslovskaya, O., & Lugtig, P. (2022). Representativeness in six waves of CROss-National Online Survey (CRONOS) panel. Journal of the Royal Statistical Society: Series A (Statistics in Society), 185(3), 851-871. https://doi.org/10.1111/rssa.12801 Go to original source...
  76. McMillen, R. C., Winickoff, J. P., Wilson, K., Tanski, S., & Klein, J. D. (2015). A dual-frame sampling methodology to address landline replacement in tobacco control research. Tobacco Control, 24(1), e10-e15. https://doi.org/10.1136/tobaccocontrol-2013-051436 Go to original source...
  77. Mercer, A., & Lau, A. (2023). Comparing two types of online survey samples. Pew Research Center. https://www.pewresearch.org/methods/2023/09/07/comparing-two-types-of-online-survey-samples/
  78. Migliavaca, C. B., Stein, C., Colpani, V., Barker, T. H., Ziegelmann, P. K., Munn, Z., & Falavigna, M. (2022). Meta-analysis of prevalence: I<sup>2</sup> statistic and how to deal with heterogeneity. Research Synthesis Methods, 13(3), 363-367. https://doi.org/10.1002/jrsm.1547 Go to original source...
  79. Nayak, M. S. D. P., & Narayan, K. A. (2019). Strengths and weaknesses of online surveys. IOSR Journal of Humanities and Social Sciences, 24(5), 31-38.
  80. Ormston, R., Martin, C., Rogers, L., Huskinson, T., Irvin, E., Rimmington, E., & Lynn, P. (2024). Financial and resource implications. In Long term survey strategy: Mixed mode research report (Chap. 10). Scottish Government, Chief Statistician, Digital Directorate. https://www.gov.scot/publications/mixed-mode-research-report-inform-scottish-government-long-term-survey-strategy/
  81. Page, M. J., Higgins, J. P. T., & Sterne, J. A. C. (2024). Chapter 13: Assessing risk of bias due to missing results in a synthesis. In J. P. T. Higgins, J. Thomas, J. Chandler, M. Cumpston, T. Li, M. J. Page, & V. A. Welch (Eds.), Cochrane handbook for systematic reviews of interventions (Version 6.5, updated August 2024). Cochrane. https://www.training.cochrane.org/handbook
  82. Page, M. J., McKenzie, J. E., Bossuyt, P. M., Boutron, I., Hoffmann, T. C., Mulrow, C. D., Shamseer, L., Tetzlaff, J. M., Akl, E. A., Brennan, S. E., Chou, R., Glanville, J., Grimshaw, J. M., Hróbjartsson, A., Lalu, M. M., Li, T., Loder, E. W., Mayo-Wilson, E., McDonald, S., McGuinness, L. A., Stewart, L. A., Thomas, J., Tricco, A. C., Welch, V. A., Whiting, P., & Moher, D. (2021). The PRISMA 2020 statement: An updated guideline for reporting systematic reviews. BMJ, 372, n71. https://doi.org/10.1136/bmj.n71 Go to original source...
  83. Pang, J., & Capek, J. (2020). Factors influencing researcher cooperation in virtual academic communities based on principal component analysis. Acta Informatica Pragensia, 9(1), 4-17. https://doi.org/10.18267/j.aip.128 Go to original source...
  84. Pennay, D. W., Neiger, D., Lavrakas, P. J., & Borg, K. (2018). The Online Panels Benchmarking Study: A total survey error comparison of findings from probability-based surveys and non-probability online panel surveys in Australia (CSRM Methods Series No. 2/2018). Centre for Social Research and Methods.
  85. R Core Team. (2020). R: A language and environment for statistical computing. R Foundation for Statistical Computing. https://www.R-project.org/
  86. Räsänen, P., Oksanen, A., Lehdonvirta, V., & Blank, G. (2024). Social media, web, and panel surveys: Using non-probability samples to study population characteristics. In Advanced research methods for applied psychology: Design, analysis and reporting (2nd ed., pp. 140-152). Routledge. https://doi.org/10.4324/9781003362715-13 Go to original source...
  87. Rasinski, K. A., Lee, L., & Krishnamurty, P. (2012). Question order effects. In H. Cooper (Ed.-in-Chief), P. M. Camic, D. L. Long, A. T. Panter, D. Rindskopf, & K. J. Sher (Eds.), APA handbook of research methods in psychology: Vol. 1. Foundations, planning, measures, and psychometrics (pp. 229-248). American Psychological Association. https://doi.org/10.1037/13619-014 Go to original source...
  88. Raudenbush, S. W. (2009). Analyzing effect sizes: Random-effects models. In H. Cooper, L. V. Hedges, & J. C. Valentine (Eds.), The handbook of research synthesis and meta-analysis (pp. 295-315). Russell Sage Foundation.
  89. Rebenok, V., Rozhi, I., Petro, Y., Kozub, H., & Diachenko, N. (2024). Evolving information landscape: ICT's influence on societal digitalisation. Multidisciplinary Science Journal, 6, 2024ss0706. https://doi.org/10.31893/multiscience.2024ss0706 Go to original source...
  90. Rehm, J., Patra, J., Brennan, A., Buckley, C., Greenfield, T. K., Kerr, W. C., Manthey, J., Purshouse, R. C., Rovira, P., Shuper, P. A., & Shield, K. D. (2021). The role of alcohol use in the aetiology and progression of liver disease: A narrative review and a quantification. Drug and Alcohol Review, 40(5), 638-646. https://doi.org/10.1111/dar.13286 Go to original source...
  91. Revilla, M. (2013). Measurement invariance and quality of composite scores in a face-to-face and a web survey. Survey Research Methods, 7(1), 17-28. https://doi.org/10.18148/srm/2013.v7i1.5098 Go to original source...
  92. Ryan, K. L., Taylor, S. M., Lyle, J. M., Stark, K. E., & Tracey, S. R. (2024). On the Line and Online: Higher Non-Response to Web-Based Surveys Over-Represents Avid Recreational Fishers Compared With Telephone Surveys. Fisheries Management and Ecology, 32(3). https://doi.org/10.1111/fme.12752 Go to original source...
  93. Sakshaug, J. W., Wi¶niowski, A., Perez Ruiz, D. A., & Blom, A. G. (2019). Supplementing small probability samples with nonprobability samples: A Bayesian approach. Journal of Official Statistics, 35(3), 653-681. https://doi.org/10.2478/JOS-2019-0027 Go to original source...
  94. Saris, W. E., & Gallhofer, I. N. (2007). Design, evaluation, and analysis of questionnaires for survey research. Wiley. https://doi.org/10.1002/9780470165195 Go to original source...
  95. Scherpenzeel, A. (2011). Data collection in a probability-based Internet panel: How the LISS panel was built and how it can be used. Bulletin de Méthodologie Sociologique, 109(1), 56-61. https://doi.org/10.1177/0759106310387713 Go to original source...
  96. Scherpenzeel, A. C., & Bethlehem, J. G. (2011). How representative are online panels? Problems of coverage and selection and possible solutions. In M. Das, P. Ester, & L. Kaczmirek (Eds.), Social and behavioral research and the Internet: Advances in applied methods and research strategies (pp. 105-132). Routledge/Taylor & Francis Group. Go to original source...
  97. Schonlau, M., van Soest, A., & Kapteyn, A. (2007). Are 'webographic' or attitudinal questions useful for adjusting estimates from web surveys using propensity scoring? Survey Research Methods, 1(3), 155-163. https://doi.org/10.18148/srm/2007.v1i3.70 Go to original source...
  98. Schwarz, N., Knäuper, B., & Oyserman, D. (2008). The psychology of asking questions. In E. de Leeuw, J. Hox, & D. Dillman (Eds.), International handbook of survey methodology (pp. 18-34). Taylor & Francis.
  99. Selman, C. J., Lee, K. J., Whitehead, C. L., Manley, B. J., & Mahar, R. K. (2023). Statistical analyses of ordinal outcomes in randomised controlled trials: Protocol for a scoping review. Trials, 24, Article 72. https://doi.org/10.1186/s13063-023-07262-8 Go to original source...
  100. Seol, D.-H., Jang, D.-H., & LoCascio, S. P. (2023). RDD with follow-up texting: A new attempt to build a probability-based online panel in South Korea. Asian Journal for Public Opinion Research, 11(3), 257-273. https://doi.org/10.15206/ajpor.2023.11.3.257 Go to original source...
  101. Smith, T. W. (2003). An experimental comparison of Knowledge Networks and the GSS. International Journal of Public Opinion Research, 15(2), 167-179. https://doi.org/10.1093/ijpor/15.2.167 Go to original source...
  102. Smith, T. W., & Dennis, J. M. (2004). Comparing the Knowledge Networks web-enabled panel and the in-person 2002 General Social Survey: Experiments with mode, format, and question wordings (GSS Methodological Report No. 99). National Opinion Research Center, University of Chicago. https://gss.norc.org/content/dam/gss/get-documentation/pdf/reports/methodological-reports/MR99%20Comparing%20the%20Knowledge%20Networks%20Web-Enabled%20Panel%20and%20the%20In-Person%202002%20GSS.pdf
  103. Spijkerman, R., Knibbe, R., Knoops, K., Van de Mheen, D., & Van den Eijnden, R. (2009). The utility of online panel surveys versus computer-assisted interviews in obtaining substance-use prevalence estimates in the Netherlands. Addiction, 104(10), 1641-1645. https://doi.org/10.1111/j.1360-0443.2009.02642.x Go to original source...
  104. Stadtmüller, S., Silber, H., Gummer, T., Sand, M., Zins, S., Beuthner, C., & Christmann, P. (2023). Evaluating an alternative frame for address-based sampling in Germany: The address database from Deutsche Post Direkt. Methods, Data, Analyses, 17(1), 29-46. https://doi.org/10.12758/mda.2022.06 Go to original source...
  105. Stahl, B. C., Timmermans, J., & Flick, C. (2017). Ethics of emerging information and communication technologies: On the implementation of responsible research and innovation. Science and Public Policy, 44(3), 369-381. https://doi.org/10.1093/scipol/scw069 Go to original source...
  106. Stanley, M., Roycroft, J., Amaya, A., Dever, J. A., & Srivastav, A. (2020). The effectiveness of incentives on completion rates, data quality, and nonresponse bias in a probability-based internet panel survey. Field Methods, 32(2), 159-179. https://doi.org/10.1177/1525822x20901802 Go to original source...
  107. Struminskaya, B. (2014). Data quality in probability-based online panels: Nonresponse, attrition, and panel conditioning. Doctoral dissertation, Utrecht University. https://dspace.library.uu.nl/bitstream/1874/301751/3/struminskaya.pdf
  108. Struminskaya, B., de Leeuw, E., & Kaczmirek, L. (2016). Mode system effects in an online panel study: Comparing a probability-based online panel with two face-to-face reference surveys. Methods, Data, Analyses, 9(1), 3-56. https://doi.org/10.12758/mda.2015.001
  109. Survey Quality Predictor. (2017). SQP coding instructions. Universitat Pompeu Fabra. http://sqp.upf.edu/media/files/sqp_coding_instructions.pdf
  110. Symbaluk, D., & Hall, R. (2024). Surveys. In Research methods: Exploring the social world in Canadian context (Chap. 7). MacEwan University.
  111. Tourangeau, R., & Yan, T. (2007). Sensitive questions in surveys. Psychological Bulletin, 133(5), 859-883. https://doi.org/10.1037/0033-2909.133.5.859
  112. Tourangeau, R., Conrad, F. G., & Couper, M. P. (2013). Introduction. In The science of web surveys (pp. 1-10). Oxford University Press. https://doi.org/10.1093/acprof:oso/9780199747047.003.0001
  113. Unangst, J. J., Amaya, A. E., Sanders, H. L. P. S., Howard-Doering, J., Ferrell, A. R., Karon, S. L., & Dever, J. A. (2020). A process for decomposing total survey error in probability and nonprobability surveys: A case study comparing health statistics in US internet panels. Journal of Survey Statistics and Methodology, 8(1), 62-88. https://doi.org/10.1093/jssam/smz040
  114. Vaithianathan, R., Hool, B., Hurd, M. D., & Rohwedder, S. (2021). High-frequency internet survey of a probability sample of older Singaporeans: The Singapore Life Panel. Singapore Economic Review, 66(6), 1759-1778.
  115. van der Schyff, K., Foster, G., Renaud, K., & Flowerday, S. (2023). Online privacy fatigue: A scoping review and research agenda. Future Internet, 15(5), Article 164. https://doi.org/10.3390/fi15050164
  116. Vehovar, V., & Beullens, K. (2017). Cross-national issues in response rates. In D. L. Vannette & J. A. Krosnick (Eds.), The Palgrave Handbook of Survey Research (pp. 29-42). Springer International Publishing. https://doi.org/10.1007/978-3-319-54395-6_5
  117. Vehovar, V., & Lozar Manfreda, K. (2017). Overview: Online surveys. In N. G. Fielding, R. M. Lee, & G. Blank (Eds.), The SAGE handbook of online research methods (pp. 143-161). SAGE.
  118. Vehovar, V., Batagelj, Z., Lozar Manfreda, K., & Zaletel, M. (2002). Nonresponse in web surveys. In R. M. Groves, D. A. Dillman, J. L. Eltinge, & R. J. A. Little (Eds.), Survey nonresponse (pp. 229-242). Wiley.
  119. Vehovar, V., Smutny, Z., & Bartol, J. (2022). Evolution of social informatics: Publications, research, and educational activities. The Information Society, 38(5), 307-333. https://doi.org/10.1080/01972243.2022.2092570
  120. Vehovar, V., Toepoel, V., & Steinmetz, S. (2016). Non-probability sampling. In C. Wolf, D. Joye, T. W. Smith, & Y.-C. Fu (Eds.), The SAGE handbook of survey methodology (pp. 329-345). SAGE.
  121. Verhulst, B., & Neale, M. C. (2021). Best practices for binary and ordinal data analyses. Behavior Genetics, 51(3), 204-214. https://doi.org/10.1007/s10519-020-10031-x
  122. Viechtbauer, W. (2005). Bias and efficiency of meta-analytic variance estimators in the random-effects model. Journal of Educational and Behavioral Statistics, 30(3), 261-293. https://doi.org/10.3102/10769986030003261
  123. Viechtbauer, W. (2010). Conducting meta-analyses in R with the metafor package. Journal of Statistical Software, 36(3), 1-48. https://doi.org/10.18637/jss.v036.i03
  124. Vojíř, S., & Kučera, J. (2021). Towards re-decentralized future of the web: Privacy, security and technology development. Acta Informatica Pragensia, 10(3), 349-369. https://doi.org/10.18267/j.aip.169
  125. Westland, H., Vervoort, S., Kars, M., & Jaarsma, T. (2025). Interviewing people on sensitive topics: Challenges and strategies. European Journal of Cardiovascular Nursing, 24(3), 488-493. https://doi.org/10.1093/eurjcn/zvae128
  126. Williams, M. S., Ebel, E. D., & Wagner, B. A. (2007). Monte Carlo approaches for determining power and sample size in low-prevalence applications. Preventive Veterinary Medicine, 82(1-2), 151-158. https://doi.org/10.1016/j.prevetmed.2007.05.015
  127. Yeager, D. S., Krosnick, J. A., Chang, L., Javitz, H. S., Levendusky, M. S., Simpser, A., & Wang, R. (2011). Comparing the accuracy of RDD telephone surveys and internet surveys conducted with probability and non-probability samples. Public Opinion Quarterly, 75(4), 709-747. https://doi.org/10.1093/poq/nfr020

This is an open access article distributed under the terms of the Creative Commons Attribution 4.0 International License (CC BY 4.0), which permits use, distribution, and reproduction in any medium, provided the original publication is properly cited. No use, distribution or reproduction is permitted which does not comply with these terms.