In 1944, Goldman published a report on a “poll on the polls” (Goldman, 1944). He felt this study was necessary because polls had established themselves as a scientific method through claims of methodological rigor but were facing “serious and widespread criticism” (Goldman, 1944: 461). He used a poll on the polls to understand how the American public viewed polls. After reporting the results, his conclusion was “a striking vote of confidence,” based on respondents’ knowledge about polls, positive attitudes toward their value, and trust in their results (Goldman, 1944: 467). At that time, the main criticism of polls concerned not their methodology, but rather whether those conducting the polls intended to produce accurate results, whether polls were interpreted fairly, and whether polls contributed to the functioning of democracy (Goldman, 1944).
Almost 80 years later, after a period that featured an enormous expansion of survey-based research, a group of American survey researchers and polling practitioners published a joint article stating twelve recommendations to protect the “integrity of survey research” (Jamieson et al., 2023). Their article makes clear that the criticism of polls and surveys has shifted toward the methodology itself. Additionally, Kim et al. (2011: 165) documented a “markedly negative shift in attitudes toward public opinion researchers and polls across several dimensions between the mid-1990s and the first decade of the 2000s.”
Against this backdrop, it should come as no surprise that Tourangeau stated that “[i]t seems obvious ‘to the ear’ that the survey and polling business is in crisis” (Tourangeau, 2017: 803). Similarly, in 2018, Johnson recognized that survey research and social statistics at large have become targets of a societal delegitimization process (Johnson, 2018). In fact, there are many reasons why we as a profession should be concerned about the current survey climate. Those reasons include over-surveying and survey fatigue (Leeper, 2019; Sinickas, 2007; Weiner & Dalessio, 2006), declining response rates (Keeter, 2018; Schoeni et al., 2013; Tourangeau, 2017), sociodemographic bias (Stein et al., 2025; Tolonen et al., 2006), the problem of professional respondents (Hillygus et al., 2014; Matthijsse et al., 2015; Silber et al., 2023), failures of polls to predict election results (Durand et al., 2001; Sturgis et al., 2018), new privacy regulations (Bauer et al., 2022), the increasing costs of face-to-face surveys (Tourangeau, 2017; Wolf et al., 2021), and the widespread use and publication of results from non-probability surveys without adequate descriptions of their limitations (Cornesse et al., 2020; Jerit & Barabas, 2023; Kohler, 2019; Kohler & Post, 2023), to name only a few of the obvious challenges that can affect the accuracy and usefulness of surveys. Recent research has also identified conditions under which members of the public assign lower credibility to poll results that disagree with their own prior beliefs, attitudes, or candidate preferences (Johnson et al., 2024; Kuru et al., 2017; 2020a; 2020b; Madson & Hillygus, 2020).
The term “survey climate,” used to describe the role of surveys in society, was coined by Lyberg and Lyberg (1991), who studied nonresponse trends in Sweden. Since then, the term has been broadened to include “willingness to participate,” “public opinion about surveys,” and “general societal characteristics” (Loosveldt & Joye, 2016). Echoing this encompassing view of surveys and their role in society, recent work from Germany suggests that declining trust in institutions, including trust in science, and lower levels of civic engagement might be directly related to lower willingness to participate in surveys (Silber et al., 2022). Altogether, challenges to scientific surveys arise on multiple fronts, suggesting a pressing need for related research from multiple perspectives.
The research presented in this special issue on the survey climate falls into two distinct groups: research on survey methods, especially on explaining and documenting survey nonresponse, and research on survey attitudes, especially on public trust in official statistics and in reports of survey results. The seven articles in this special issue are international in scope, covering Germany, Hungary, the Netherlands, Sweden, the United Kingdom, and the United States.
With respect to the special issue’s contributions on survey methods, Lundmark and Backström (2025) present survey results covering three decades (1993 to 2023) in Sweden, together with a comparison against administrative register data. The authors show that increases in nonresponse cannot be explained by factors such as birth cohort replacement or immigration patterns, leading them to identify declining willingness to participate as the main driver of growing nonparticipation.
A second contribution, by Klingwort and Toepoel (2025), applies a meta-analytic approach to crime surveys conducted in Germany between 2001 and 2021, examining the influence of several survey design features on response rates. This approach allows the authors to explain a large proportion of the heterogeneity across studies: surveys conducted in later years exhibit lower response rates, the Computer-Assisted Telephone Interview (CATI) mode yields lower response rates than other survey modes, and surveys conducted by universities tend to achieve higher response rates than those conducted by ministries.
Third, Rosche et al. (2025) use data from the LISS Panel in the Netherlands to analyze unit nonresponse and attrition. The authors draw on the survey attitude scale (De Leeuw et al., 2019), which was administered annually in the LISS Panel between 2008 and 2013. They first establish that the survey attitude scale captures both variation and stability of attitudes across waves, with about two-thirds of the variance attributable to stable patterns. In a second step, they compare the predictive power of the survey attitude scale with that of respondents’ psychographic and sociodemographic profiles, showing that while the profiles performed better in predicting unit nonresponse, the survey attitudes were superior in predicting panel attrition.
Fourth, Herold et al. (2025) use German data from the Survey of Health, Ageing and Retirement in Europe (SHARE), a multidisciplinary and cross-national panel database, to investigate item nonresponse to the income question and respondents’ consent to link their survey data to external records held by the German pension insurance. As explanatory variables, the authors incorporate additional questions, collected in a paper-and-pencil drop-off questionnaire, on trust in institutions that conduct surveys, survey-related privacy concerns, attitudes toward surveys, and attitudes toward revealing one’s personal income. They find that concerns about data privacy and negative attitudes toward revealing income were positively associated with item nonresponse, and that lower trust in institutions that conduct surveys, data privacy concerns, and perceptions of survey burden reduced linkage consent rates.
Turning to the special issue’s contributions on survey attitudes, Maslovskaya and Bianchi (2025) document an increase in trust in official statistics in the UK between 2014 and 2021, although at least some of this increase may be due to higher levels of unit nonresponse in later years. Factors that may have positively influenced trust in official statistics in 2021 include the UK Census, conducted in the same year, and the frequent reporting of official health statistics in the context of the COVID-19 pandemic. The authors also show that older and less educated respondents reported lower levels of trust than their younger and more educated counterparts.
Investigating trust in reports of survey results, Stefkovics and Kmetty (2025) conduct a cross-cultural replication and extension of a study by Stadtmüller et al. (2022). Using data collected in Hungary and the United States, the authors implement vignette experiments on reports of survey results covering two topics, commuter allowances and immigration, in which survey quality information, sponsors, and the studies’ results were systematically varied. The findings indicate that reporting a larger sample size enhanced trust and that, in both countries, individuals with higher levels of education attributed more importance to methodological information, regardless of the topic. Furthermore, the study reveals that the degree of respondents’ political polarization affected how they interpreted survey results, especially those that supported their own views, an effect that also interacted with the study’s sponsor.
The survey attitude section features a second study on trust in reports of survey results, by Holbrook et al. (2025), based on three vignette experiments conducted in separate telephone surveys in the state of Ohio in the United States. In this contribution, the authors develop a dual processing model of reports of survey results, which suggests that individuals evaluate more methodologically rigorous surveys more positively only when they are both able and motivated to complete the respective response task accurately.
In summary, we view the articles in this special issue as a first step toward depicting crucial aspects of the international survey climate, enabling researchers to respond in an informed manner to the current crisis of scientific surveys. Specifically, the four methodological contributions take systematic approaches, drawing on meta-analysis, long-term trend data, administrative data, and repeated measures of survey attitudes, to better understand and predict survey nonresponse. Moreover, the three articles in the attitudinal section provide new insights into the development of trust in official statistics and into how the public processes reports of survey results, especially with respect to the underlying data collection methods.
This topical collection of original articles should serve as a starting point for a continuing discussion about the survey climate and trust in scientific surveys. We believe there is a clear demand for more comparative research to understand (country-level) characteristics that might affect perceptions of surveys and trust in official statistics. The consequences of these issues for sustained funding of data collections by policymakers and elected officials must also be considered. With increasing costs and decreasing response rates, conducting high-quality scientific surveys becomes ever more challenging, and innovative strategies are needed to uphold methodological rigor in this difficult environment. To date, one of the most important initiatives in this context is AAPOR’s Transparency Initiative, established in 2014, which encompasses a comprehensive list of methodological procedures that research organizations should disclose when reporting survey-based findings (AAPOR, 2021).
As a precursor to this special issue, a workshop on the international survey climate was held at the University of Kassel in Germany (October 4–5, 2022), supported by the German Academy of Sociology and the University of Kassel. Among the thirteen presentations was one by Don Dillman and colleagues, who spoke about “How the Changing Nature of Surveys May Be Affecting Survey Climate and Trust.” Dillman passed away in June 2024 and will be deeply missed; the topic of the international survey climate and trust in surveys was clearly close to his heart. We hope that the articles included in this special issue will contribute to a broad discussion of this topic.
AAPOR (2021). Transparency initiative disclosure elements. American Association for Public Opinion Research. https://aapor.org/wp-content/uploads/2022/11/TI-Attachment-C.pdf
Bauer, P. C., Gerdon, F., Keusch, F., Kreuter, F., & Vannette, D. (2022). Did the GDPR increase trust in data collectors? Evidence from observational and experimental data. Information, Communication & Society, 25(14), 2101–2121.
Cornesse, C., Blom, A. G., Dutwin, D., Krosnick, J. A., De Leeuw, E. D., Legleye, S., Pasek, J., Pennay, D., Phillips, B., Sakshaug, J. W., & Struminskaya, B. (2020). A review of conceptual approaches and empirical evidence on probability and nonprobability sample survey research. Journal of Survey Statistics and Methodology, 8(1), 4–36.
Durand, C., Blais, A., & Vachon, S. (2001). A late campaign swing or a failure of the polls? The case of the 1998 Quebec election. Public Opinion Quarterly, 65(1), 108–123.
Goldman, E. F. (1944). Poll on the polls. Public Opinion Quarterly, 8(4), 461–467.
Herold, I., Bergmann, M., & Bethmann, A. (2025). Trust, concerns and attitudes: examples for respondent (non-)cooperation in SHARE. Survey Research Methods. https://doi.org/10.18148/srm/2025.v19i2.8274
Hillygus, D. S., Jackson, N., & Young, M. (2014). Professional respondents in non-probability online panels. In M. Callegaro, R. Baker, J. Bethlehem, A. S. Goritz, J. A. Krosnick & P. J. Lavrakas (Eds.), Online panel research: a data quality perspective (pp. 219–237). Chichester: John Wiley & Sons.
Holbrook, A., Lavrakas, P. J., Johnson, T. P., Crosby, A., Polskaia, P., Wang, X., Hu, X., Kapousouz, E., Cho, Y. I., & Silber, H. (2025). Using experimental vignettes in a telephone survey to study how survey methods and findings affect the public’s evaluation of public opinion polls: considering a dual process approach. Survey Research Methods. https://doi.org/10.18148/srm/2025.v19i2.8261
Jamieson, K. H., Lupia, A., Amaya, A., Brady, H. E., Bautista, R., Clinton, J. D., Denver, J. A., Dutwin, D., Goroff, D. L., Hillygus, D. S., Kennedy, C., Langer, G., Lapinski, J. S., Link, M., Philpot, T., Prewitt, K., Rivers, D., Vavreck, L., Wilson, D. C., & McNutt, M. K. (2023). Protecting the integrity of survey research. PNAS Nexus, 2(3), pgad049.
Jerit, J., & Barabas, J. (2023). Are nonprobability surveys fit for purpose? Public Opinion Quarterly, 87(3), 816–840.
Johnson, T. P. (2018). Presidential address: legitimacy, wicked problems, and public opinion research. Public Opinion Quarterly, 82(3), 614–621.
Johnson, T. P., Silber, H., & Darling, J. E. (2024). Public perceptions of pollsters in the United States: experimental evidence. Social Science Quarterly, 105(1), 114–127.
Keeter, S. (2018). Evidence about the accuracy of surveys in the face of declining response rates. In D. L. Vannette & J. A. Krosnick (Eds.), Palgrave handbook of survey research (pp. 19–22). London: Palgrave Macmillan.
Kim, J., Gershenson, C., Glaser, P., & Smith, T. W. (2011). The polls—trends: trends in surveys on surveys. Public Opinion Quarterly, 75(1), 165–191.
Klingwort, J., & Toepoel, V. (2025). Effects of survey design features on response rates: a meta-analytical approach using the example of crime surveys. Survey Research Methods. https://doi.org/10.18148/srm/2025.v19i2.8173
Kohler, U. (2019). Possible uses of nonprobability sampling for the social sciences. Survey Methods: Insights from the Field. https://surveyinsights.org/?p=10981
Kohler, U., & Post, J. C. (2023). Welcher Zweck heiligt die Mittel? Bemerkungen zur Repräsentativitätsdebatte in der Meinungsforschung [Which end justifies the means? Remarks on the representativeness debate in opinion research]. Zeitschrift für Soziologie, 52(1), 67–88.
Kuru, O., Pasek, J., & Traugott, M. W. (2017). Motivated reasoning in the perceived credibility of public opinion polls. Public Opinion Quarterly, 81(2), 422–446.
Kuru, O., Pasek, J., & Traugott, M. W. (2020a). When polls disagree: how competitive results and methodological quality shape partisan perceptions of polls and electoral predictions. International Journal of Public Opinion Research, 32(3), 586–603.
Kuru, O., Pasek, J., & Traugott, M. W. (2020b). When pundits weigh in: do expert and partisan critiques in news reports shape ordinary individuals’ interpretations of polls? Mass Communication and Society, 23(5), 628–655.
Leeper, T. J. (2019). Where have the respondents gone? Perhaps we ate them all. Public Opinion Quarterly, 83(S1), 280–288.
de Leeuw, E., Hox, J., Silber, H., Struminskaya, B., & Vis, C. (2019). Development of an international survey attitude scale: measurement equivalence, reliability, and predictive validity. Measurement Instruments for the Social Sciences. https://doi.org/10.1186/s42409-019-0012-x
Loosveldt, G., & Joye, D. (2016). Defining and assessing survey climate. In C. Wolf, D. Joye, T. W. Smith & Y. Fu (Eds.), The Sage handbook of survey methodology (pp. 67–76). Los Angeles: SAGE.
Lundmark, S., & Backström, K. (2025). Predicting survey nonresponse with registry data in Sweden between 1993 to 2023: cohort replacement or a deteriorating survey climate? Survey Research Methods. https://doi.org/10.18148/srm/2025.v19i2.8278
Lyberg, I., & Lyberg, L. (1991). Nonresponse research at Statistics Sweden. In Proceedings of the survey research methods section of the American Statistical Association (pp. 78–87).
Madson, G. J., & Hillygus, D. S. (2020). All the best polls agree with me: bias in evaluations of political polling. Political Behavior, 42(4), 1055–1072.
Maslovskaya, O., & Bianchi, A. (2025). Public confidence in official statistics in the United Kingdom: characteristics of respondents and changes over time. Survey Research Methods. https://doi.org/10.18148/srm/2025.v19i3.8284
Matthijsse, S. M., De Leeuw, E. D., & Hox, J. J. (2015). Internet panels, professional respondents, and data quality. Methodology: European Journal of Research Methods for the Behavioral and Social Sciences, 11(3), 81–88.
Rosche, B., Bons, H., Hox, J., & de Leeuw, E. (2025). The survey attitude scale as an indicator of survey attitude and a predictor of nonresponse and panel dropout. Survey Research Methods. https://doi.org/10.18148/srm/2025.v19i3.8284
Schoeni, R. F., Stafford, F., McGonagle, K. A., & Andreski, P. (2013). Response rates in national panel surveys. The ANNALS of the American Academy of Political and Social Science, 645(1), 60–87.
Silber, H., Moy, P., Johnson, T. P., Neumann, R., Stadtmüller, S., & Repke, L. (2022). Survey participation as a function of democratic engagement, trust in institutions, and perceptions of surveys. Social Science Quarterly, 103(7), 1619–1632.
Silber, H., Stadtmüller, S., & Cernat, A. (2023). Comparing participation motives of professional and non-professional respondents. International Journal of Market Research, 65(4), 361–372.
Sinickas, A. (2007). Finding a cure for survey fatigue. Strategic Communication Management, 11(2), 11.
Stadtmüller, S., Silber, H., & Beuthner, C. (2022). What influences trust in survey results? Evidence from a vignette experiment. International Journal of Public Opinion Research, 34(2), edac012.
Stefkovics, A., & Kmetty, Z. (2025). Trust in survey results. A cross-country replication experiment. Survey Research Methods. https://doi.org/10.18148/srm/2025.v19i2.8267
Stein, A., Gummer, T., Naumann, E., Rohr, B., Silber, H., Auriga, R., Bergmann, M., Bethmann, A., Blohm, M., Cornesse, C., Christmann, C., Coban, M., Décieux, J. P., Gauly, B., Hahn, C., Helmschrott, S., Hochman, O., Lemcke, J., Naber, D., Pötzschke, S., Roßmann, J., Schanze, J.-L., Schmidt, T., Schneider, S. L., Spangenberg, H., Rettig, T., Trappmann, M., Weinhardt, M., & Weiß, B. (2025). Education bias in probability-based surveys in Germany: evidence and possible solutions. International Journal of Social Research Methodology. https://doi.org/10.1080/13645579.2025.2508889
Sturgis, P., Kuha, J., Baker, N., Callegaro, M., Fisher, S., Green, J., Jennings, W., Lauderdale, B. E., & Smith, P. (2018). An assessment of the causes of the errors in the 2015 UK general election opinion polls. Journal of the Royal Statistical Society. Series A (Statistics in Society), 181(3), 757–781.
Tolonen, H., Helakorpi, S., Talala, K., Helasoja, V., Martelin, T., & Prättälä, R. (2006). 25-year trends and socio-demographic differences in response rates: Finnish adult health behaviour survey. European Journal of Epidemiology, 21(6), 409–415.
Tourangeau, R. (2017). Presidential address: paradoxes of nonresponse. Public Opinion Quarterly, 81(3), 803–814.
Weiner, S. P., & Dalessio, A. T. (2006). Oversurveying: causes, consequences, and cures. In A. I. Kraut (Ed.), Getting action from organizational surveys: New concepts, methods, and applications (pp. 294–311). San Francisco: Jossey-Bass.
Wolf, C., Christmann, P., Gummer, T., Schnaudt, C., & Verhoeven, S. (2021). Conducting general social surveys as self-administered mixed-mode surveys. Public Opinion Quarterly, 85(2), 623–648.