Puzzling Answers to Crosswise Questions: Examining Overall Prevalence Rates, Response Order Effects, and Learning Effects

Authors

  • Sandra Walzenbach
  • Thomas Hinz

DOI:

https://doi.org/10.18148/srm/2023.v17i1.8010

Keywords:

crosswise model, randomized response, social desirability bias, primacy effects, learning effects, panel conditioning, privacy concerns

Abstract

This validation study of the crosswise model (CM) examines five survey experiments implemented in a general population survey. Our first crucial result is that in none of these experiments did the crosswise model verifiably reduce social desirability bias. In contrast to most previous CM applications, we use an experimental design that allows us to distinguish a reduction in social desirability bias from heuristic response behaviour, such as random ticking, which leads to false positive or false negative answers. In addition, we provide insights into two potential explanatory mechanisms that have not yet received attention in empirical studies: primacy effects and panel conditioning. We find no consistent primacy effects, nor does response quality improve through learning when respondents have encountered crosswise models in past survey waves. We interpret our results as evidence that the crosswise model does not work in general population surveys and speculate that the question format arouses mistrust among participants.
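For readers unfamiliar with the technique: the crosswise model pairs the sensitive question with a non-sensitive auxiliary statement of known prevalence p, and respondents only report whether their answers to the two statements are the same or different. The trait prevalence is then recovered at the aggregate level. Below is a minimal sketch of the standard CM estimator (the function name, sample counts, and auxiliary prevalence are illustrative, not taken from the article):

```python
def cm_prevalence(n_same, n_total, p):
    """Estimate sensitive-trait prevalence from crosswise-model responses.

    n_same:  respondents choosing 'both statements true or both false'
    n_total: total respondents
    p:       known prevalence of the non-sensitive auxiliary statement
             (must differ from 0.5, or the estimator is undefined)
    """
    if p == 0.5:
        raise ValueError("auxiliary prevalence must not equal 0.5")
    lam = n_same / n_total  # observed share of 'same' answers
    # Model: lam = pi*p + (1 - pi)*(1 - p), solved for pi.
    pi_hat = (lam + p - 1) / (2 * p - 1)
    # Sampling noise can push the raw estimate outside [0, 1]; clamp it.
    return max(0.0, min(1.0, pi_hat))

# Hypothetical example: 430 of 1000 respondents pick 'same', p = 0.2.
estimate = cm_prevalence(430, 1000, 0.2)
```

Note that heuristic response behaviour such as random ticking, which the study is designed to detect, violates the model assumption behind this estimator and biases the resulting prevalence estimate.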

Downloads

Published

2022-11-08

How to Cite

Walzenbach, S., & Hinz, T. (2022). Puzzling Answers to Crosswise Questions: Examining Overall Prevalence Rates, Response Order Effects, and Learning Effects. Survey Research Methods, 17(1), 1–13. https://doi.org/10.18148/srm/2023.v17i1.8010

Section

Articles
