Survey Research Methods <p>Survey Research Methods is the official peer-reviewed journal of the European Survey Research Association (ESRA). The journal publishes articles in English that discuss methodological issues related to survey research.</p> (Andre Pirralha) (Publication Officer) Fri, 09 Apr 2021 15:24:21 +0000 The Role of Time, Weather and Google Trends in Understanding and Predicting Web Survey Response <p>In the literature on web survey methodology, significant effort has been made to understand the role of time-invariant factors (e.g. gender, education and marital status) in (non-)response mechanisms. Time-invariant factors alone, however, cannot account for most of the variation in (non-)response, especially fluctuations of response rates over time. This observation inspires us to investigate their counterpart, namely time-varying factors, and the potential role they play in web survey (non-)response. Specifically, we study the effects of time, weather and societal trends (derived from Google Trends data) on the daily (non-)response patterns of the 2016 and 2017 Dutch Health Surveys. Using discrete-time survival analysis, we find, among other things, that weekends, holidays, pleasant weather, disease outbreaks and terrorism salience are associated with fewer responses. Furthermore, we show that these variables alone achieve satisfactory prediction accuracy for both daily and cumulative response rates when the trained model is applied to future unseen data. This approach has the further benefit of requiring only non-personal contextual information and thus raises no privacy issues. 
We discuss the implications of the study for survey research and data collection.</p> Qixiang Fang, Joep Burger, Ralph Meijers, Kees van Berkel Copyright (c) 2021 Qixiang Fang, Joep Burger, Ralph Meijers, Kees van Berkel Sat, 10 Apr 2021 00:00:00 +0000 Have You Ever Seen the Rain? It Looks Like It's Going to Rain! <p>This empirical study examines whether the weather conditions during the different seasons in which panel surveys are carried out affect the timing and extent of survey participation. Based on considerations regarding the panellists’ habits and their assessment of a participation's benefits and costs relative to alternative actions, it is assumed that ‘pleasant’ weather diverts them from immediately completing the questionnaire, while ‘unpleasant’ weather results in a higher degree of participation right after survey launch. The results of an event history analysis based on longitudinal data from a multi-wave panel confirm these assumptions. Additionally, there appears to be an interaction between season and weather: ‘pleasant’ weather in spring results in a lower participation rate than in summer while, given the same weather, the participation rate is higher in autumn. Finally, regardless of the season, heavy rainfall at the beginning of the field period is most beneficial for conducting an online survey in terms of both rapid response and high participation rates.</p> Rolf Becker Copyright (c) 2021 Rolf Becker Sat, 10 Apr 2021 00:00:00 +0000 Studying the Context Effect of Family Norms on Gender Role Attitudes: An Experimental Design <p>The measurement of gender role attitudes has been found to be problematic in previous studies, especially from a comparative perspective. The present study adopts a novel approach and investigates the position of the gender role attitudes scale in the questionnaire as a potential source of bias. 
In particular, the present study aims to assess the context effect of the family norms question on the measurement of gender role attitudes, adopting the theoretical perspective of the construal model of attitudes, according to which adjacent questions constitute the context for interpreting and answering a <em>stimulus</em>. The study employs data from the CROss-National Online Survey panel, which was fielded in 2017 and contained an experiment in which the order of the questions under investigation varied. The reliability, validity and invariance of the measurement of gender role attitudes across experimental settings and countries (Estonia, Great Britain and Slovenia) are explored using several analytical techniques, such as regression models and multiple-group confirmatory factor analysis. Differences between experimental settings emerged, suggesting that the questionnaire context matters for the validity and stability of the gender role attitudes items; however, the lack of clear patterns prevents general conclusions about which question order yields better measurement of the gender role attitudes scale. Clear differences among the countries indicate that the cultural context may interact with the question context. Finally, we stress that the measurement is poor overall, urging researchers to find a better formulation of the items measuring gender role attitudes.</p> Angelica Maineri, Vera Lomazzi, Ruud Luijkx Copyright (c) 2021 Angelica Maineri, Vera Lomazzi, Ruud Luijkx Sat, 10 Apr 2021 00:00:00 +0000 The Relationship Between Response Probabilities and Data Quality in Grid Questions <p>Response probabilities are used in adaptive and responsive survey designs to guide data collection efforts, often with the goal of diversifying the sample composition. However, if response probabilities are also correlated with measurement error, this approach could introduce bias into survey data. 
This study analyzes the relationship between response probabilities and data quality in grid questions. Drawing on data from the probability-based GESIS Panel, we found that low-propensity cases produced item nonresponse and nondifferentiated answers more frequently than high-propensity cases. However, this effect was observed only among long-time respondents, not among those who joined more recently. We caution that using adaptive or responsive techniques may increase measurement error even as it reduces the risk of nonresponse bias.</p> Tobias Gummer, Ruben Bach, Jessica Daikeler, Stephanie Eckman Copyright (c) 2021 Tobias Gummer, Ruben Bach, Jessica Daikeler, Stephanie Eckman Sat, 10 Apr 2021 00:00:00 +0000 Using Response Times to Enhance the Reliability of Political Knowledge Items: An Application to the 2015 Swiss Post-Election Survey <p>In this article, I consider the problem of "cheating" in political knowledge tests. This problem has been made more pressing by the transition of many surveys to online interviewing, which opens up the possibility of looking up the correct answers on the internet. Several methods have been proposed to deal with cheating ex ante, including self-reports of cheating, controls for internet browsing, or time limits. Against this background, “response times” (RTs, i.e., the time taken by respondents to answer a survey question) suggest themselves as a post-hoc, unobtrusive means of detecting cheating. In this paper, I propose a procedure for measuring individual-specific and item-specific RTs, which are then used to identify unusually long but correct answers to knowledge questions as potential cases of cheating. I apply this procedure to the post-election survey for the 2015 Swiss national elections. My analysis suggests that extremely slow responses to two out of four questions are clearly suspicious. Accordingly, I propose a method for “correcting” individual knowledge scores and examine its convergent and predictive validity. 
Based on the finding that a simple revised scale of political knowledge has greater validity than the original additive scale, I conclude that the RT method can alleviate the problem of cheating; the method is summarized in the conclusion to facilitate its application in empirical research.</p> Lionel Marquis Copyright (c) 2021 Lionel Marquis Sat, 10 Apr 2021 00:00:00 +0000 How to Reconstruct a Trend when Survey Questions Have Changed Over Time <p>Many trend studies draw on survey data and compare responses to questions on the same topic that have been asked over time. A problem with such studies is that the questions often do not remain identical, owing to changes in phrasing and response formats. We present ways to deal with this problem, using trend data on life satisfaction in Japan as an illustrative case. Life satisfaction has been measured in the Life in Nation survey in Japan since 1958, and the question used has been changed several times. We examine three methods published by scholars who tried to reconstruct a main trend in life satisfaction from these broken time series and came to different conclusions. In this paper we discuss their methods and present two new techniques for dealing with changes in survey questions on the same topic.</p> Tineke de Jonge, Akiko Kamesaka, Ruut Veenhoven Copyright (c) 2021 Tineke de Jonge, Akiko Kamesaka, Ruut Veenhoven Sat, 10 Apr 2021 00:00:00 +0000 Recent Methodological Advances in Panel Data Collection, Analysis, and Application <p>SRM plans to publish a Special Issue on "Recent Methodological Advances in Panel Data Collection, Analysis, and Application". This call for papers describes the types of papers sought and the submission policies.</p> Tobias Wolbring, Sabine Zinn Copyright (c) 2021 Tobias Wolbring, Sabine Zinn Fri, 09 Apr 2021 15:16:07 +0000
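The discrete-time survival approach used in the first abstract above treats each field day as a separate response opportunity, so daily (non-)response can be modeled as a logistic hazard on person-day data. The sketch below is only an illustration of that general technique, not the authors' actual model: the covariates (`weekend`, `pleasant`), the effect sizes and all data are synthetic stand-ins for the paper's time and weather predictors.

```python
import numpy as np

rng = np.random.default_rng(0)

# Discrete-time survival setup: each sampled person contributes one row per
# field day until they respond, with day-level covariates only.
n_persons, n_days = 500, 14
weekend = ((np.arange(n_days) % 7) >= 5).astype(float)
pleasant = (rng.random(n_days) < 0.5).astype(float)
true_beta = np.array([-2.0, -0.5, -0.5])  # intercept, weekend, pleasant

rows, events = [], []
for _ in range(n_persons):
    for d in range(n_days):
        x = np.array([1.0, weekend[d], pleasant[d]])
        p = 1.0 / (1.0 + np.exp(-(x @ true_beta)))  # daily response hazard
        y = rng.random() < p
        rows.append(x)
        events.append(y)
        if y:  # respondent leaves the risk set after responding
            break
X = np.vstack(rows)
y = np.array(events, dtype=float)

# Fit the logistic hazard model by Newton-Raphson (IRLS).
beta = np.zeros(3)
for _ in range(25):
    mu = 1.0 / (1.0 + np.exp(-(X @ beta)))
    grad = X.T @ (y - mu)
    hess = (X * (mu * (1.0 - mu))[:, None]).T @ X
    beta += np.linalg.solve(hess, grad)

# Predicted daily hazards for a fresh field period, and the implied
# cumulative response rate: 1 - prod(1 - hazard_d).
day_X = np.column_stack([np.ones(n_days), weekend, pleasant])
hazard = 1.0 / (1.0 + np.exp(-(day_X @ beta)))
cumulative = 1.0 - np.cumprod(1.0 - hazard)
```

Once fitted on one year's fieldwork, `day_X` can be rebuilt from next year's calendar and forecasts to predict response rates before the survey launches, which mirrors the prediction exercise the abstract describes.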
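The response-time idea in Marquis's abstract, flagging unusually long but correct answers after accounting for individual-specific and item-specific RTs, can likewise be sketched with a simple double-centering of an RT matrix. This is a hypothetical reconstruction of the general technique, not the article's actual procedure: the data, the z &gt; 2.5 flagging threshold and the score correction are all assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical data: response times (seconds) and correctness for 200
# respondents on 10 knowledge items.
n_resp, n_items = 200, 10
speed = rng.normal(10.0, 2.0, size=(n_resp, 1))        # respondent baseline
difficulty = rng.normal(0.0, 1.0, size=(1, n_items))   # item baseline
rt = speed + difficulty + rng.normal(0.0, 1.0, size=(n_resp, n_items))
correct = rng.random((n_resp, n_items)) < 0.6

# Plant one "cheater": respondent 0 answers item 0 correctly but very slowly.
rt[0, 0] += 30.0
correct[0, 0] = True

# Double-centre the RTs to strip out respondent-specific speed and
# item-specific length/difficulty, leaving person-by-item residuals.
resid = (rt - rt.mean(axis=1, keepdims=True)
            - rt.mean(axis=0, keepdims=True) + rt.mean())
z = (resid - resid.mean()) / resid.std()

# A correct answer with an extremely slow residual RT is treated as a
# potential internet look-up; flagged items are removed from the score.
suspicious = correct & (z > 2.5)
score = correct.sum(axis=1)
corrected_score = score - suspicious.sum(axis=1)
```

The double-centering step is what makes the flag "unobtrusive": no extra questions or browser monitoring are needed, only the timing data the survey software already records.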