
Detecting Careless Responses to Self-Reported Questionnaires

Year 2016, Volume: 16, Issue: 64, 0 - 0, 15.07.2016

Abstract

Problem Statement: The use of self-report questionnaires may lead to biases, such as careless responses, that distort research outcomes. Early detection of careless responses in self-report questionnaires may reduce error, but little guidance exists in the literature on techniques for detecting such careless or random responses.

Purpose of the Study: The purpose of this study was to examine whether the respondent's goodness-of-fit (RGF) score can be used to indicate careless responses in completing self-report questionnaires. It was hypothesized that there is a significant difference in RGF between careless and true responses, and that the RGF of careless responses is higher than the RGF of true responses.

Method: An experimental research design using a self-report questionnaire was conducted with 205 respondents divided into two groups: the first group responded truthfully to the questionnaire, while the second group responded carelessly. The validity and reliability of the questionnaire had been tested. One hundred eighty-five respondents were assigned to the true-response group, while the other 20 respondents comprised the careless-response group. An independent-samples t-test was used to evaluate the difference in RGF between true responses and careless responses.

Findings: A comparison of the mean RGF scores of careless and true responses showed a significant difference. The frequency distribution of true responses tends to be normally distributed, while the presence of careless responses produces a right-skewed distribution. The RGF of careless responses is higher than the RGF of true responses.

Conclusion and Recommendations: RGF may be used as an indicator of careless responding in self-report questionnaires where more accurate data are expected. Social science research that uses self-report questionnaires to measure the affective domain may compute RGF to determine whether careless responses exist.

Keywords: Careless response, Questionnaire development, Random response, Goodness-of-fit.
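The abstract does not give the exact formula for the RGF score, so the sketch below is only illustrative: it assumes RGF can be approximated by a per-respondent chi-square goodness-of-fit statistic that compares each respondent's distribution of answers across the Likert categories with the distribution observed in the pooled sample, and it simulates the 185/20 group split described in the Method section. The function name `rgf_scores`, the 20-item questionnaire, and the 5-point scale are assumptions for the example, not details taken from the paper.

```python
# Minimal sketch (assumed RGF definition): per-respondent chi-square
# goodness-of-fit against the pooled category distribution, followed by
# an independent-samples t-test between the two groups.
import numpy as np
from scipy import stats

def rgf_scores(responses: np.ndarray, n_categories: int = 5) -> np.ndarray:
    """responses: (n_respondents, n_items) matrix of Likert answers 1..n_categories."""
    n_items = responses.shape[1]
    # Expected proportion of each category, estimated from the whole sample.
    expected_prop = np.array(
        [(responses == c).mean() for c in range(1, n_categories + 1)]
    )
    expected_counts = expected_prop * n_items
    scores = []
    for row in responses:
        observed = np.array([(row == c).sum() for c in range(1, n_categories + 1)])
        # Chi-square goodness-of-fit statistic for this respondent,
        # skipping categories that never occur in the sample.
        mask = expected_counts > 0
        chi2 = ((observed[mask] - expected_counts[mask]) ** 2 / expected_counts[mask]).sum()
        scores.append(chi2)
    return np.array(scores)

# Simulated data: 185 "true" respondents clustered on 3-5, 20 "careless"
# respondents answering uniformly at random (assumed, for illustration only).
rng = np.random.default_rng(0)
true_group = rng.integers(3, 6, size=(185, 20))
careless_group = rng.integers(1, 6, size=(20, 20))

all_responses = np.vstack([true_group, careless_group])
rgf = rgf_scores(all_responses)

# Independent-samples t-test comparing RGF of the two groups.
t_stat, p_value = stats.ttest_ind(rgf[185:], rgf[:185], equal_var=False)
print(f"careless vs. true RGF: t = {t_stat:.2f}, p = {p_value:.4g}")
```

Under this construction, uniformly random answers deviate strongly from the pooled category distribution, so the careless group receives systematically higher RGF scores, which is consistent with the pattern reported in the Findings.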

References

  • Begin, G., Boivin, M., & Bellerose, J. (1979). Sensitive data collection through the random response technique: Some improvements. Journal of Psychology, 101(1), 53-65.
  • Castro, R. (2013). Inconsistent respondents and sensitive questions. Field Methods, (3), 283-298. doi: 10.1177/1525822X12466988
  • Conijn, J. M., Emons, W. H. M., Van Assen, M. A. L. M., Pedersen, S. S., & Sijtsma, K. (2013). Explanatory, multilevel person-fit analysis of response consistency on the Spielberger state-trait anxiety inventory. Multivariate Behavioral Research, (5), 692-718.
  • Crede, M. (2010). Random responding as a threat to the validity of effect size estimates in correlational research. Educational and Psychological Measurement, (4), 596-612.
  • Crino, M. D. (1985). The random response technique as an indicator of questionnaire item social desirability/personal sensitivity. Educational and Psychological Measurement, 45(3), 453-468.
  • Escobal, J., & Benites, S. (2013). PDAs in socio-economic surveys: Instrument bias, surveyor bias or both? International Journal of Social Research Methodology, (1), 47-63. doi: 10.1080/13645579.2011.648420
  • Garcia, A. A. (2011). Cognitive interviews to test and refine questionnaires. Public Health Nursing, 28(5), 444-450. doi: 10.1111/j.1525-1446.2010.00938.x
  • Godinho, A., Kushnir, V., & Cunningham, J. A. (2016). Unfaithful findings: Identifying careless responding in addictions research. Addiction, 111(6), 955-
  • Johnson, T. P., & Wislar, J. S. (2012). Response rates and nonresponse errors in surveys. Journal of the American Medical Association, 307(17), 1805-1806.
  • Keeley, J. W., Webb, C., Peterson, D., Roussin, L., & Flanagan, E. H. (2016). Development of a response inconsistency scale for the personality inventory for DSM-5. Journal of Personality Assessment, 98(4), 351-359.
  • Keller, G., & Warrack, B. (2016). Statistics for management and economics (9th ed.). Australia: Thomson.
  • Lara, D., Garcia, S. G., Ellertson, C., Camlin, C., & Suarez, J. (2006). The measure of induced abortion levels in Mexico using random response technique. Sociological Methods & Research, 35(2), 279-301.
  • Meade, A. W., & Craig, S. B. (2012). Identifying careless responses in survey data. Psychological Methods, 17(3), 437-455.
  • Meyer, J. F., Faust, K. A., Faust, D., Baker, A. M., & Cook, N. E. (2013). Careless and random responding on clinical and research measures in the addictions: A concerning problem and investigation of their detection. International Journal of Mental Health and Addiction, 7(3), 292-306.
  • Penwarden, R. (2013, August). How to avoid nonresponse bias. FluidSurveys. Retrieved from http://fluidsurveys.com/how-to-avoid-nonresponse-bias.
  • Reaser, J. M. (1975). A test of the forced-alternative random response questionnaire technique. Technical Report No. 75-9.
  • Rogelberg, S. G., Fisher, G. G., Maynard, D. C., Hakel, M. D., & Horvath, M. (2001). Attitudes toward surveys: Development of a measure and its relationship to respondent behavior. Organizational Research Methods, 4(1), 3-25.
  • Rousseau, B., & Ennis, J. M. (2013). Importance of correct instructions in the Tetrad test. Journal of Sensory Studies, 28(4), 264-269. doi: 10.1111/joss.12049
  • Spector, P. E. (1992). Summated rating scale construction: An introduction. California: Sage Publications, Inc. pp. 30-31.
  • Sriramatr, S., Berry, T. R., Rodgers, W., & Stolp, S. (2012). The effect of different response formats on ratings of exerciser stereotypes. Social Behavior and Personality, 40(10), 1655-1666.
  • Summers, G. F., & Hammonds, A. D. (1969). Toward a paradigm for respondent bias in survey research. Sociological Quarterly, 10(1), 113-121.
  • Tabachnick, B. G., & Fidell, L. S. (2013). Using multivariate statistics (6th ed.). Boston: Pearson. p. 661.
  • Tatsuoka & Tatsuoka (as cited in Harnisch, 1981). Analysis of item response patterns: Consistency indices and their application to criterion-referenced tests. Paper presented at the Annual Meeting of the American Educational Research Association, Los Angeles, April 13-17, 1981.
  • Thompson, B., Melancon, J. G., & Kier, F. J. (1998). Faking/random response scales for the PPSDQ-93 measure of Jungian personality types. Paper presented at the annual meeting of the Southwestern Psychological Association (New Orleans, LA, April).
  • Warner, S. L. (as cited in Begin, Boivin, & Bellerose, 1979). Sensitive data collection through the random response technique: Some improvements. Journal of Psychology, 101(1), 53-65.

Details

Section: Articles
Authors

Ronny Kountur

Publication Date: July 15, 2016
Published Issue: Year 2016, Volume: 16, Issue: 64

How to Cite

APA Kountur, R. (2016). Detecting Careless Responses to Self-Reported Questionnaires. Eurasian Journal of Educational Research, 16(64).
AMA Kountur R. Detecting Careless Responses to Self-Reported Questionnaires. Eurasian Journal of Educational Research. October 2016;16(64).
Chicago Kountur, Ronny. "Detecting Careless Responses to Self-Reported Questionnaires". Eurasian Journal of Educational Research 16, no. 64 (October 2016).
EndNote Kountur R (01 October 2016) Detecting Careless Responses to Self-Reported Questionnaires. Eurasian Journal of Educational Research 16 64
IEEE R. Kountur, "Detecting Careless Responses to Self-Reported Questionnaires", Eurasian Journal of Educational Research, vol. 16, no. 64, 2016.
ISNAD Kountur, Ronny. "Detecting Careless Responses to Self-Reported Questionnaires". Eurasian Journal of Educational Research 16/64 (October 2016).
JAMA Kountur R. Detecting Careless Responses to Self-Reported Questionnaires. Eurasian Journal of Educational Research. 2016;16.
MLA Kountur, Ronny. "Detecting Careless Responses to Self-Reported Questionnaires". Eurasian Journal of Educational Research, vol. 16, no. 64, 2016.
Vancouver Kountur R. Detecting Careless Responses to Self-Reported Questionnaires. Eurasian Journal of Educational Research. 2016;16(64).