Survey Research Methods 2022-04-19T13:00:17+00:00 Andre Pirralha Open Journal Systems <p>Survey Research Methods is the official peer-reviewed journal of the European Survey Research Association (ESRA). The journal publishes articles in English, which discuss methodological issues related to survey research.</p> How to enhance web survey data using metered, geolocation, visual and voice data? 2022-04-11T16:13:47+00:00 Melanie Revilla <p>After briefly summarizing why there is a need to enhance web survey data, this paper explains how metered, geolocation, visual and voice data could help to supplement conventional web survey data, particularly when mobile participation is high. It presents the expected benefits of these four data types in terms of respondents’ burden, data quality and possible new insights, as well as a number of expected disadvantages, on both the respondents’ and researchers’ sides. Finally, the paper discusses what is still missing and the next steps needed to turn these new opportunities into realities.</p> 2022-04-10T00:00:00+00:00 Copyright (c) 2022 Melanie Revilla Nonresponse analysis in a longitudinal smartphone-based travel study 2022-04-19T13:00:17+00:00 Peter Lugtig Katie Roth Barry Schouten <p>Currently, travel surveys are the standard method for measuring mobility in official statistics. Nonresponse and measurement error are problematic in travel surveys, due to the high burden and non-centrality of the requested information. To overcome these issues, new methods are emerging.</p> <p>The aim of this paper is to assess nonresponse in an experimental travel study carried out in the Netherlands. A smartphone application was developed that passively collects GPS coordinates and automatically populates a travel diary. Participants are then asked for additional information, such as travel mode.</p> <p>In the experiment, respondents from a random sample of the Dutch population participated in a 7-day study that varied how respondents were recruited into the study, as well as the size and timing of a monetary incentive. We examine at what stage of the study respondents chose to participate or drop out, and assess nonresponse bias across 13 variables from the Dutch population register in order to understand how selective nonresponse was at the different stages of the app study. We found that incentive group, age and education were strong predictors of nonresponse. The overall representativity of the study, as expressed in R-indicators and coefficients of variation, was rather low because of this bias. However, we found the same biases going in opposite directions when we computed R-indicators for an earlier web-based travel-diary study. 
This implies that, in the future, diary studies should focus on methods that combine smartphone apps with web- or paper-based diaries in order to successfully limit nonresponse.</p> 2022-04-10T00:00:00+00:00 Copyright (c) 2021 Peter Lugtig, Katie Roth, Barry Schouten Measurement Equivalence in Sequential Mixed-Mode Surveys 2022-04-11T07:35:35+00:00 Joseph Sakshaug Alexandru Cernat Richard J. Silverwood Lisa Calderwood George B. Ploubidis <p>Many surveys collect data using a mixture of modes administered in sequential order. Although the impacts of mixed-mode designs on measurement quality have been extensively studied, their impact on the measurement quality of unobservable (or latent) constructs is still an understudied area of research. In particular, it is unclear whether latent constructs derived from multi-item scales are measured equivalently across different sequentially administered modes – an assumption that is often made by analysts, but rarely tested in practice. In this study, we assess the measurement equivalence of several commonly used multi-item scales collected in a sequential mixed-mode (Web-telephone-face-to-face) survey: the Age 25 wave of the Next Steps cohort study. After controlling for selection via an extensive data-driven weighting procedure, a multi-group confirmatory factor analysis was performed to assess measurement equivalence across the three modes. We show that cross-mode measurement equivalence is achieved for the majority of scales, with partial equivalence established for the remaining scales. Although measurement equivalence was achieved, some differences in the latent means were observed between the modes. We conclude with a discussion of these findings, their potential causes, and implications for survey practice.</p> 2022-04-10T00:00:00+00:00 Copyright (c) 2021 Joseph Sakshaug, Alexandru Cernat, Richard J. Silverwood, Lisa Calderwood, George B. 
Ploubidis Non-Compliance with Indirect Questioning Techniques: 2022-04-11T07:35:32+00:00 Thomas Krause Andreas Wahl <p style="-qt-block-indent: 0; text-indent: 0px; margin: 0px;">Indirect questioning techniques are widely discussed and used as methods to avoid or reduce the effects of social desirability in interview situations on sensitive topics. Nevertheless, current evaluation studies suggest that indirect questioning techniques have a bigger compliance problem than evaluation studies based on the "more-is-better" principle would suggest. In our study, we investigate the extent to which compliance problems can be identified for a variant of the Randomized Response Technique, for the Crosswise Model, and for the Triangular Model. By means of an aggregate-level and an individual-level validation, we examine the response patterns of the participants. In contrast to the actual empirical application context of sensitive topics, we use a non-sensitive question that cannot be distorted by social desirability bias. The resulting "same-is-best" rationale differs from most evaluation studies to date, which work according to the "more-is-better" or "less-is-better" principle. Our analyses are based on data from a convenience sample of 1277 students in the form of an online survey experiment. The results suggest that the indirect questioning techniques show substantial weaknesses in terms of compliance and encourage further individual-level evaluations.</p> 2022-04-10T00:00:00+00:00 Copyright (c) 2021 Thomas Krause, Andreas Wahl Survey Participation in the Time of Corona 2022-04-11T07:35:27+00:00 Rolf Becker Sara Möser Nora Moser David Glauser <p>We investigate the singular effect of a public shutdown – the result of non-pharmaceutical official orders and arrangements in the course of the COVID-19 pandemic – on survey participation in a panel study carried out in the German-speaking cantons of Switzerland. 
In this panel context, it is possible to analyse this impact on the response of juveniles born around 1997 using paradata on the timing of participation in an online survey. Based on these longitudinal data collected in an event-oriented design, competing time-varying effects on survey participation and changes in the progress of the pandemic are controlled for, in addition to time-constant covariates such as panellists’ individual and social resources. A comparison with the previous panel wave, which took place in the same month two years earlier, as well as a dynamic event history analysis within the current field period of the online survey, makes it obvious that the public shutdown and its consequences for panellists’ everyday life increased target persons’ propensity to participate in the survey during the public shutdown. This is indeed the case in terms of both the timing and the level of response.</p> 2022-04-10T00:00:00+00:00 Copyright (c) 2021 Rolf Becker, Sara Möser, Nora Moser, David Glauser Postscriptum to "Survey Participation in the Time of Corona" 2022-04-11T07:35:17+00:00 Rolf Becker Sara Möser Nora Moser David Glauser <p>The analysis of "Survey Participation in the Time of Corona" is replicated by taking into account a more recent survey conducted one year later during the same period. The results clearly indicate that the temporary public shutdown in spring 2020 indeed boosted panellists’ participation at the initial stage of the survey.</p> 2022-04-10T00:00:00+00:00 Copyright (c) 2022 Rolf Becker Accounting for cross-country-cross-time variations in measurement invariance testing. 
A case of political participation 2022-04-11T07:35:24+00:00 Piotr Koc Artur Pokropek <p>In many works involving measurement invariance testing, researchers concentrate on only one type of grouping, such as countries, even when the comparisons they make involve multiple types of grouping, such as countries and years. In this article, we propose a procedure that allows researchers to incorporate more than one type of grouping into invariance testing. We use the example of political participation, which is often studied in a comparative perspective where both countries and years are considered. The results show that the comparability of levels of political participation in Europe over the last 20 years is limited. With a simulation study, we show that one remedy for this could be alignment optimization, which produces more accurate estimates of means and standard errors. Furthermore, we demonstrate that ignoring the non-invariance can change our substantive conclusions regarding the aggregated trends of participation.</p> 2022-04-10T00:00:00+00:00 Copyright (c) 2021 Piotr Koc, Artur Pokropek An Evaluation of the quality of interviewer and virtual observations and their value for nonresponse bias reduction 2022-04-11T07:35:37+00:00 Weijia Ren Tom Krenzke Brady West David Cantor <p>With the decline of survey response rates over the past decade, survey researchers need to gather useful auxiliary variables for all sampled units and reduce nonresponse bias through adaptive design or nonresponse weighting adjustments. One potential source of auxiliary information is interviewer observations of the characteristics of sampled units. Compared with area-level characteristics, which researchers often have available for reducing bias due to survey nonresponse, characteristics at the dwelling-unit level may provide more information about survey variables of interest and result in weight adjustments that could potentially reduce bias further. 
These observations, however, may vary greatly among observers and may lack the quality needed by survey data producers. To investigate the quality and usefulness of such observations, this study assesses completeness, validity, interviewer variance, and predictive power for bias reduction in a national pilot study for both in-person interviewer observations and virtual observations. This paper sheds light on the dwelling-unit characteristics that are harder to observe, the differences between interviewer and virtual observations, the potential value added beyond area-level characteristics for nonresponse adjustments, and ways to improve the observations.</p> 2022-04-10T00:00:00+00:00 Copyright (c) 2021 Weijia Ren, Tom Krenzke, Brady West, David Cantor