Survey Research Methods https://ojs.ub.uni-konstanz.de/srm <p>Survey Research Methods is the official peer-reviewed journal of the European Survey Research Association (ESRA). The journal publishes articles in English that discuss methodological issues related to survey research.</p> en-US <p><a href="https://ojs.ub.uni-konstanz.de/srm/copyright" target="_blank" rel="noopener">Copyright Notice</a></p> surveyresearchmethods@uni-konstanz.de (Andre Pirralha) surveyresearchmethods@uni-konstanz.de (Publication Officer) Tue, 10 Aug 2021 16:11:55 +0000 OJS 3.1.2.1 http://blogs.law.harvard.edu/tech/rss 60

Establishing a baseline: bringing innovation to the evaluation of cross-national probability-based online panels https://ojs.ub.uni-konstanz.de/srm/article/view/7457

A number of countries in Europe and beyond have established online or mixed-mode panels with a web component based upon probability samples. This paper assesses the first attempt to do this cross-nationally using an input-harmonised approach, a major innovation in cross-national survey methodology. The European Social Survey (ESS) established a panel using the face-to-face interview as the basis for recruitment. The experiment was conducted in Estonia, Slovenia and Great Britain using an input-harmonised approach to the design throughout, something never done before across multiple countries simultaneously. The paper outlines how the experiment was conducted. It then compares the web panel respondents to the achieved ESS face-to-face sample in each country, and compares the achieved web panel sample to external benchmarks. Most importantly, since the literature is very scarce, differences in attitudinal and behavioural characteristics are also assessed. By comparing the answers of the total achieved sample in the ESS to the subset who also answered the CRONOS web panel, we assess changes in representativeness and substantive answers without confounding the findings with other changes such as mode effects. This approach is only possible where ‘piggybacking’ recruitment has been used. Such recruitment is rare even at the national level, and this is the first time survey methodologists have employed it cross-nationally, allowing such an analytical approach. Our findings suggest that the CRONOS sample is not too divergent from the target population or from the ESS, with the exception of the oldest age groups. However, there are cross-national differences, suggesting that cross-national comparability may differ from that of estimates obtained from a face-to-face survey. Gianmaria Bottoni, Rory Fitzgerald Copyright (c) 2021 Gianmaria Bottoni, Rory Fitzgerald https://creativecommons.org/licenses/by-nc/4.0 https://ojs.ub.uni-konstanz.de/srm/article/view/7457 Tue, 10 Aug 2021 00:00:00 +0000

Dependent interviewing: a remedy or a curse for measurement error in surveys? https://ojs.ub.uni-konstanz.de/srm/article/view/7640

<p>Longitudinal surveys often rely on dependent interviewing (DI) to lower the levels of random measurement error in survey data and reduce the incidence of spurious change. DI refers to a data collection technique that incorporates information from prior interview rounds into subsequent waves. While this method is considered an effective remedy for random measurement error, it can also introduce more systematic errors, in particular when respondents are first reminded of their previously provided answer and then asked about their current status. The aim of this paper is to assess the impact of DI on measurement error in employment mobility. We take advantage of a unique experimental situation that was created by the roll-out of dependent interviewing in the Dutch Labour Force Survey (LFS). We apply Hidden Markov Modeling (HMM) to linked LFS and Employment Register (ER) data that cover a period before and after dependent interviewing was abolished, which in turn enables the modeling of systematic errors in the LFS data. Our results indicate that DI lowered the probability of obtaining random measurement error but had no significant effect on the systematic component of the error. The lack of a significant effect might be partially due to the fact that the probability of repeating the same error was extremely high at baseline (i.e., when using standard, independent interviewing); therefore the use of DI could not increase this probability any further.</p> Paulina Pankowska, Bart Bakker, Daniel Oberski, Dimitris Pavlopoulos Copyright (c) 2021 Paulina Pankowska, Bart Bakker, Daniel Oberski, Dimitris Pavlopoulos https://creativecommons.org/licenses/by-nc/4.0 https://ojs.ub.uni-konstanz.de/srm/article/view/7640 Tue, 10 Aug 2021 00:00:00 +0000
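To make the hidden Markov approach described in the abstract above concrete: such a model treats true employment status as a latent state that evolves between waves, with the observed survey answer linked to it through a misclassification matrix. Below is a minimal sketch with a hand-rolled forward algorithm; the binary state space, parameter values, and single observed indicator are illustrative assumptions, not the paper's specification, which links two indicators (LFS and ER) and estimates the parameters from the data.

```python
import numpy as np

# Minimal hidden-Markov measurement-error sketch: a binary latent employment
# status (0 = not employed, 1 = employed) evolves between waves and is
# observed with misclassification. All parameter values are illustrative.
pi = np.array([0.3, 0.7])             # initial latent distribution
A = np.array([[0.90, 0.10],           # latent transitions between waves
              [0.05, 0.95]])
B = np.array([[0.95, 0.05],           # P(observed answer | latent state);
              [0.08, 0.92]])          # off-diagonals are measurement error

def log_likelihood(obs):
    """Scaled forward algorithm: log P(observed sequence) under the HMM."""
    alpha = pi * B[:, obs[0]]
    ll = np.log(alpha.sum())
    alpha /= alpha.sum()
    for y in obs[1:]:
        alpha = (alpha @ A) * B[:, y]
        ll += np.log(alpha.sum())
        alpha /= alpha.sum()
    return ll

# A reported sequence with a flip-flop is less likely than a stable one,
# which is how the model separates spurious change from true mobility.
print(log_likelihood([1, 1, 0, 1]), log_likelihood([1, 1, 1, 1]))
```

In the paper's setting, each wave contributes two linked indicators, the error matrices differ between the DI and independent-interviewing regimes, and the parameters would be estimated by maximum likelihood rather than fixed as here.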
Enhancing the Demand for Labour survey by including skills from online job advertisements using model-assisted calibration https://ojs.ub.uni-konstanz.de/srm/article/view/7670

<p>In this article we describe an enhancement to the Demand for Labour (DL) survey conducted by Statistics Poland, which involves the inclusion of skills obtained from online job advertisements. The main goal is to provide estimates of the demand for skills (competences), which are missing from the DL survey. To achieve this, we apply a data integration approach combining traditional calibration with a LASSO-assisted approach to correct coverage and selection error in the online data. Faced with the lack of access to unit-level data from the DL survey, we use estimated population totals and propose a bootstrap approach that accounts for the uncertainty of totals reported by Statistics Poland. We show that the LASSO-assisted calibration estimator outperforms traditional calibration in terms of standard errors and reduces representation bias in skills observed in online job ads. Our empirical results show that online data significantly overestimate interpersonal, managerial and self-organization skills while underestimating technical and physical skills. This is mainly due to the under-representation of occupations categorised as Craft and Related Trades Workers and Plant and Machine Operators and Assemblers.</p> Maciej Eryk Beręsewicz, Greta Białkowska, Krzysztof Marcinkowski, Magdalena Maślak, Piotr Opiela, Katarzyna Pawlukiewicz, Robert Pater Copyright (c) 2021 Maciej Eryk Beręsewicz, Robert Pater https://creativecommons.org/licenses/by-nc/4.0 https://ojs.ub.uni-konstanz.de/srm/article/view/7670 Fri, 15 Jan 2021 00:00:00 +0000
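The mechanics of a LASSO-assisted calibration estimator can be sketched in a few lines: a LASSO fit selects the auxiliary variables that predict the outcome, and the starting weights are then calibrated so that the weighted totals of the selected auxiliaries match known benchmarks. Everything below (the data, weights, benchmark totals, and the linear calibration form) is a simplified hypothetical illustration, not the authors' implementation:

```python
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)

# Hypothetical setup: X = auxiliary variables for n online-job-ad units,
# y = a skill indicator, d = starting weights, totals = known benchmarks.
n, p = 500, 10
X = rng.normal(size=(n, p))
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=n) > 0).astype(float)
d = np.full(n, 20.0)
totals = (d @ X) * 1.1                # pretend the population totals differ

# Step 1: LASSO selects the auxiliaries that actually predict the outcome.
sel = np.flatnonzero(Lasso(alpha=0.05).fit(X, y).coef_ != 0)
Xs = X[:, sel]

# Step 2: linear (GREG-type) calibration on the selected columns only,
# i.e. w = d * (1 + x'lambda) chosen so that sum_i w_i * x_i = totals.
lam = np.linalg.solve(Xs.T @ (d[:, None] * Xs), totals[sel] - d @ Xs)
w = d * (1.0 + Xs @ lam)

print("benchmark check:", np.round(w @ Xs - totals[sel], 8))  # ~0 by design
print("calibrated skill share:", (w @ y) / w.sum())
```

The paper additionally bootstraps the benchmark totals to reflect the fact that they are themselves estimates; here they are treated as fixed for brevity.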
Directional Pattern based Clustering for Quantitative Survey Data: Method and Application https://ojs.ub.uni-konstanz.de/srm/article/view/7773

<p>Analysis of survey data is of significant concern, as it plays a key role in organizational and behavioral research. Quantitative survey data possesses several distinct characteristics, e.g., a fixed small range of ordinal values and the importance of respondent category labels. For these reasons, quantitative survey data is not well suited to existing analysis methods that rely on aggregate statistics. The literature advises using pattern-based analysis tools instead of aggregate statistics, since patterns are more informative and efficient in reflecting respondents’ preferences. We therefore introduce a specialized pattern-based clustering technique for survey data that uses direction rather than magnitude. Further, it does not require manual setting of clustering parameters; instead, it automatically identifies respondent categories and their representative features through an adaptive procedure. We apply the proposed method to an original academic survey dataset and compare its results with the K-Means clustering method in terms of interpretability and usability, using stakeholder theory as a benchmark to verify the results. The results suggest that the proposed pattern-based clustering method performs far better at segregating survey responses according to stakeholder theory, and that the clusters it produces are much more meaningful. Hence, the results empirically validate that pattern-based analysis methods are more suitable for analyzing quantitative survey data.</p> Roopam Sadh, Rajeev Kumar Copyright (c) 2021 Roopam Sadh, Rajeev Kumar https://creativecommons.org/licenses/by-nc/4.0 https://ojs.ub.uni-konstanz.de/srm/article/view/7773 Tue, 10 Aug 2021 00:00:00 +0000
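The paper's adaptive procedure is more involved, but the core idea of clustering on direction rather than magnitude can be illustrated simply: centre each respondent's rating vector (removing their overall rating level), scale it to unit length, and cluster the resulting directions, which amounts to grouping by cosine similarity. The generated data and the use of K-Means on the transformed vectors below are illustrative assumptions, not the authors' algorithm:

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(1)

# Two hypothetical respondent categories with similar average rating levels
# but opposite directional patterns across 6 Likert items (scale 1..5).
pattern_a = np.array([5, 5, 4, 2, 1, 1])
pattern_b = pattern_a[::-1]
responses = np.vstack([
    np.clip(pattern_a + rng.integers(-1, 2, size=(40, 6)), 1, 5),
    np.clip(pattern_b + rng.integers(-1, 2, size=(40, 6)), 1, 5),
]).astype(float)

# Direction, not magnitude: centre each respondent's row (removing their
# overall rating level) and scale to unit length, then cluster.
centered = responses - responses.mean(axis=1, keepdims=True)
directions = centered / np.linalg.norm(centered, axis=1, keepdims=True)

labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(directions)
print(labels[:40].mean(), labels[40:].mean())  # the two groups separate cleanly
```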
Evaluation of Estimated Survey Duration Equations Using a Health Risk Assessment https://ojs.ub.uni-konstanz.de/srm/article/view/7800

<p>Survey duration—the time it takes to complete a survey—affects response and completion rates. Estimated survey duration equations may be used to predict survey duration before fielding; however, there are no studies assessing their use. The objective of this study is to evaluate estimated survey duration equations using a health risk assessment. Six existing estimated survey duration equations were identified. Using health risk assessment data from January 1, 2018 to December 31, 2018, an average participant profile was built to inform the inputs into the equations. Estimated survey duration of the health risk assessment ranged from 7.64 minutes to 39.6 minutes. Using the same dataset, the estimated durations were compared to the actual completion time of the health risk assessment, which averaged 11.27 minutes. The equations either under- or overestimated the completion time. The equation based on word count, number of questions, decisions, and open text boxes is recommended for estimating the duration of a health risk assessment, although it was an overestimate. Estimated survey duration equations appear to be a suitable alternative to pilot testing, but future studies are needed to evaluate them in other types of surveys.</p> Brittany Carter, James Bennett, Elric Sims Copyright (c) 2021 Brittany Carter https://creativecommons.org/licenses/by-nc/4.0 https://ojs.ub.uni-konstanz.de/srm/article/view/7800 Tue, 10 Aug 2021 00:00:00 +0000

More Clarification, Less Item Nonresponse in Establishment Surveys? A Split-Ballot Experiment https://ojs.ub.uni-konstanz.de/srm/article/view/7809

<p>The IAB Job Vacancy Survey of the German Institute for Employment Research collects detailed information on job search and vacancy durations for an establishment’s last successful hiring process. The duration questions are burdensome for respondents to answer, as they ask for precise dates of the earliest possible hiring for the vacancy, the start of the personnel search, and the decision to hire the selected applicant. Consequently, the nonresponse rates for these items have been relatively high over the years (up to 21 percent). In an effort to reduce item nonresponse, a split-ballot experiment was conducted to test the strategy of providing additional clarifying information and examples to assist respondents in answering the date questions. The results revealed a backfiring effect: although there was evidence that respondents read the additional clarifying information, it led to even more item nonresponse and lower data quality compared to the control group. Additionally, we observed a negative spillover effect with regard to item nonresponse on a subsequent (non-treated) question. We conclude this article by discussing possible causes of these results and suggestions for further research.</p> Benjamin Küfner, Joseph W. Sakshaug, Stefan Zins Copyright (c) 2021 Benjamin Küfner, Joseph W. Sakshaug, Stefan Zins https://creativecommons.org/licenses/by-nc/4.0 https://ojs.ub.uni-konstanz.de/srm/article/view/7809 Tue, 10 Aug 2021 00:00:00 +0000
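At its core, the analysis of such a split-ballot experiment comes down to comparing item nonresponse rates between the two randomly assigned ballots. A minimal sketch with purely illustrative counts (not the survey's actual figures):

```python
from scipy.stats import chi2_contingency

# Hypothetical split-ballot counts: columns = (item nonresponse, valid answer)
# on one date question. The abstract reports nonresponse rates of up to 21
# percent; the numbers below are invented for illustration only.
control   = [180, 820]   # 18% item nonresponse without added clarification
treatment = [230, 770]   # 23% item nonresponse with the clarifying text

chi2, p, dof, expected = chi2_contingency([control, treatment])
print(f"chi2 = {chi2:.2f}, p = {p:.4f}")  # small p: the rates differ
```

With real data one would also compare response distributions across the full battery of date questions and the subsequent non-treated item, as the authors do when assessing the spillover effect.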