Chapter
Response Effects and Cognitive Involvement in Answering Survey Questions
ABSTRACT
This chapter discusses the "indirect methods": the Randomized Response Technique (RRT), a recent variant of the RRT called the Crosswise Model (CM), and the Item Count Technique (ICT). The empirical research indicates that, if the design is set up carefully and the implementation is tailored to the specific topic and population under investigation, these techniques can reduce social desirability bias and improve the validity of survey measures. When deciding which method to use, it may be relevant that RRT and ICT variants exist for measuring quantitative variables, while the CM is used to study dichotomous variables. The chapter shows that the CM yielded consistently higher prevalence estimates than direct questioning for a number of items about exam cheating and plagiarism. The indirect methods also likely differ in their viability and empirical practicability across respondent subgroups. A promising route for future research would be to combine question-and-answer approaches with alternative strategies for collecting sensitive data.