Attention Check Items and Instructions in Online Surveys with Incentivized and Non-Incentivized Samples : Boon or Bane for Data Quality?
In this paper, we examine rates of careless responding and reactions to detection methods (i.e., attention check items and instructions) in an experimental setting based on two different samples. First, we use a quota sample (with monetary incentive), a central data source for internet-based surveys in sociological and political research. Second, we include a voluntary opt-in panel (without monetary incentive) well suited for conducting survey experiments (e.g., factorial surveys). Respondents' reactions to the detection items are analyzed by objective, nonreactive indicators (i.e., break-off, item non-response, and measurement quality), and two self-report scales. Our reaction analyses reveal that the detection methods we applied are not only well suited for identifying careless respondents, but also exert a motivational rather than a demotivating influence on respondents' answer behavior and, hence, contribute to data quality. Furthermore, we find that break-off behavior differs across both samples suggesting that results from methodological online studies on the basis of incentivized samples do not necessarily transfer to online studies in general