
Identifying the Careless Responders: A Cross-Country Comparison on PISA 2018 Dataset

Year 2024, Volume: 32, Issue: 3, 396-407, 31.07.2024
https://doi.org/10.24106/kefdergi.1525320

Abstract

Careless responding (CR) is a major concern in self-report measures because it degrades the quality of the collected data. This study examined a dataset for CR, investigated the effects of CR on the psychometric properties of the data, and evaluated the effectiveness of CR detection methods in a cross-cultural context. Specifically, response-time, outlier, and long-string indices were used to flag CR, and their efficiency was compared on the PISA 2018 reading-related attitude scales for Singapore and Turkey. The results indicated that the amount of CR flagged by the long-string and response-time methods was higher for Singapore than for Turkey. Furthermore, after careless respondents were removed from the dataset, scores on enjoyment of reading, perceived reading competence, and perceived difficulty of the PISA tests increased. In addition, removing respondents flagged by the long-string and response-time analyses significantly improved model fit, with the long-string index yielding the largest improvement, whereas removal based on the outlier analysis did not improve model-data fit. Overall, the results illustrate that careless responding can substantially affect the psychometric properties of self-report measures, and screening data for CR is strongly recommended in future studies.
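To make the flagging logic described in the abstract concrete, the minimal sketch below shows how a long-string index and a per-item response-time screen might be computed for a respondents-by-items matrix of Likert responses. This is an illustrative Python sketch, not the study's actual procedure: the function names, cutoff values, and NumPy-based implementation are assumptions, and the references instead point to R tooling such as the careless and PerFit packages.

import numpy as np

def long_string_index(responses: np.ndarray) -> np.ndarray:
    """Longest run of identical consecutive answers per respondent (row)."""
    n_resp, n_items = responses.shape
    longest = np.ones(n_resp, dtype=int)
    for i in range(n_resp):
        run = 1
        for j in range(1, n_items):
            run = run + 1 if responses[i, j] == responses[i, j - 1] else 1
            longest[i] = max(longest[i], run)
    return longest

def flag_careless(responses: np.ndarray, total_seconds: np.ndarray,
                  max_run: int = 8, min_item_seconds: float = 2.0) -> np.ndarray:
    """Flag a respondent if the longest identical-answer run reaches max_run
    or the average time per item falls below min_item_seconds.
    Both cutoffs are illustrative assumptions, not the study's thresholds."""
    per_item_time = total_seconds / responses.shape[1]
    return (long_string_index(responses) >= max_run) | (per_item_time < min_item_seconds)

# Example: 3 respondents, 10 four-point items; the first straight-lines and rushes.
resp = np.array([[2] * 10,
                 [1, 2, 3, 2, 4, 1, 2, 3, 4, 2],
                 [4, 4, 3, 4, 2, 1, 4, 3, 2, 4]])
print(flag_careless(resp, total_seconds=np.array([12.0, 95.0, 80.0])))  # [ True False False]

Respondents flagged in this way would then be removed before re-estimating the measurement models, mirroring the before-and-after model-fit comparison reported in the abstract.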

References

  • Alarcon, G. M., & Lee, M. A. (2022). The relationship of insufficient effort responding and response styles: An online experiment. Frontiers in Psychology, 12. https://www.frontiersin.org/article/10.3389/fpsyg.2021.784375
  • Beach, D. A. (1989). Identifying the random responder. The Journal of Psychology, 123(1), 101-103. https://doi.org/10.1080/00223980.1989.10542966
  • Beck, M. F., Albano, A. D., & Smith, W. M. (2019). Person-fit as an index of inattentive responding: A comparison of methods using polytomous survey data. Applied Psychological Measurement, 43(5), 374–387. https://doi.org/10.1177/0146621618798666
  • Bowling, N. A., Huang, J. L., Brower, C. K., & Bragg, C. B. (2021). The quick and the careless: The construct validity of page time as a measure of insufficient effort responding to surveys. Organizational Research Methods. https://doi.org/10.1177/10944281211056520
  • Brühlmann, F., Petralito, S., Aeschbach, L. F., & Opwis, K. (2020). The quality of data collected online: An investigation of careless responding in a crowdsourced sample. Methods in Psychology, 2. https://doi.org/10.1016/j.metip.2020.100022
  • Curran, P. G. (2015). Methods for the detection of carelessly invalid responses in survey data. Journal of Experimental Social Psychology, 66, 4–19. https://doi.org/10.1016/j.jesp.2015.07.006
  • Eklöf, H., Pavešič, B. J., & Grønmo, L. S. (2014). A cross-national comparison of reported effort and mathematics performance in TIMSS Advanced. Applied Measurement in Education, 27(1), 31–45. https://doi.org/10.1080/08957347.2013.853070
  • Emons, W. H. (2008). Nonparametric person-fit analysis of polytomous item scores. Applied Psychological Measurement, 32(3), 224–247. https://doi.org/10.1177/0146621607302479
  • Goldammer, P., Annen, H., Stöckli, P. L., & Jonas, K. (2020). Careless responding in questionnaire measures: Detection, impact, and remedies. The Leadership Quarterly, 31(4). https://doi.org/10.1016/j.leaqua.2020.101384
  • Goldhammer, F., Martens, T., Christoph, G., & Lüdtke, O. (2016). Test-taking engagement in PIAAC (OECD Education Working Papers, No. 133). OECD Publishing.
  • Grabe, W. (2009). Reading in a second language: Moving from theory to practice. Cambridge University Press. https://doi.org/10.1017/CBO9781139150484
  • Grau, I., Ebbeler, C., & Banse, R. (2019). Cultural differences in careless responding. Journal of Cross-Cultural Psychology, 50(3), 336–357. https://doi.org/10.1177/0022022119827379
  • Gummer, T., Roßmann, J., & Silber, H. (2021). Using instructed response items as attention checks in web surveys: Properties and implementation. Sociological Methods & Research, 50(1), 238–264. https://doi.org/10.1177/0049124118769083
  • Guo, H., Rios, J. A., Haberman, S., Liu, O. L., Wang, J., & Paek, I. (2016). A new procedure for detection of students’ rapid guessing responses using response time. Applied Measurement in Education, 29(3), 173-183. https://doi.org/10.1080/08957347.2016.1171766
  • Huang, J. L., Curran, P. G., Keeney, J., Poposki, E. M., & DeShon, R. P. (2012). Detecting and deterring insufficient effort responding to surveys. Journal of Business and Psychology, 27(1), 99–114. https://doi.org/10.1007/s10869-011-9231-8
  • Johnson, J. A. (2005). Ascertaining the validity of individual protocols from web-based personality inventories. Journal of Research in Personality, 39, 103–129. https://doi.org/10.1016/j.jrp.2004.09.009
  • Karabatsos, G. (2003). Comparing the aberrant response detection performance of thirty-six person-fit statistics. Applied Measurement in Education, 16(4), 277–298. https://doi.org/10.1207/S15324818AME1604_2
  • Kountur, R. (2016). Detecting careless responses to self-reported questionnaires. Eurasian Journal of Educational Research, 16(64), 1–35. https://doi.org/10.14689/ejer.2016.64.17
  • McFerran, M. W. (2022). Careless responding in survey research: An examination of individual, situational, and cultural characteristics (Doctoral dissertation). Florida Institute of Technology.
  • Meade, A. W., & Craig, S. B. (2012). Identifying careless responses in survey data. Psychological Methods, 17(3), 437–455. https://doi.org/10.1037/a0028085
  • Meijer, R. R., Niessen, A. S. M., & Tendeiro, J. N. (2016). A practical guide to check the consistency of item response patterns in clinical research through person-fit statistics: Examples and a computer program. Assessment, 23(1), 52-62. https://doi.org/10.1177/10731911155778
  • Meijer, R. R., & Sijtsma, K. (2001). Methodology review: Evaluating person fit. Applied Psychological Measurement, 25(2), 107-135. https://doi.org/10.1177/0146621012203
  • Mol, S. E., & Bus, A. G. (2011). To read or not to read: A meta-analysis of print exposure from infancy to early adulthood. Psychological Bulletin, 137(2), 267–296. https://doi.org/10.1037/a0021890
  • Niessen, A. S. M., Meijer, R. R., & Tendeiro, J. N. (2016). Detecting careless respondents in web-based questionnaires: Which method to use? Journal of Research in Personality, 63, 1–11. https://doi.org/10.1016/j.jrp.2016.04.010
  • Rios, J. A., Guo, H., Mao, L., & Liu, O. L. (2017). Evaluating the impact of careless responding on aggregated-scores: To filter unmotivated examinees or not? International Journal of Testing, 17(1), 74–104. https://doi.org/10.1080/15305058.2016.1231193
  • Rios, J. A., & Soland, J. (2021). Parameter estimation accuracy of the effort-moderated item response theory model under multiple assumption violations. Educational and Psychological Measurement. https://doi.org/10.1177/0013164420949896
  • Rosseel, Y. (2012). lavaan: An R package for structural equation modeling. Journal of Statistical Software, 48(2), 1-36. https://doi.org/10.18637/jss.v048.i02
  • Schroeders, U., Schmidt, C., & Gnambs, T. (2022). Detecting careless responding in survey data using stochastic gradient boosting. Educational and Psychological Measurement, 82(1), 29–56. https://doi.org/10.1177/00131644211004708
  • Soland, J., Wise, S. L., & Gao, L. (2019). Identifying disengaged survey responses: New evidence using response time metadata. Applied Measurement in Education, 32(2), 151–165. https://doi.org/10.1080/08957347.2019.1577244
  • Tendeiro, J. N., Meijer, R. R., & Niessen, A. S. M. (2016). PerFit: An R package for person-fit analysis in IRT. Journal of Statistical Software, 74(5), 1–27. https://doi.org/10.18637/jss.v074.i05
  • Ulitzsch, E., Yildirim-Erbasli, S. N., Gorgun, G., & Bulut, O. (2022). An explanatory mixture IRT model for careless and insufficient effort responding in self-report measures. British Journal of Mathematical and Statistical Psychology, 75(3). https://doi.org/10.1111/bmsp.12272
  • Ward, M. K., & Meade, A. W. (2022). Dealing with careless responding in survey data: Prevention, identification, and recommended best practices. Annual Review of Psychology, 74(1). https://doi.org/10.1146/annurev-psych-040422-045007
  • Wise, S. L., & DeMars, C. E. (2006). An application of item response time: The effort-moderated IRT model. Journal of Educational Measurement, 43(1), 19–38.
  • Wise, S. L., & Kong, X. (2005). Response time effort: A new measure of examinee motivation in computer-based tests. Applied Measurement in Education, 18(2), 163–183. https://doi.org/10.1207/s15324818ame1802_2
  • Yentes, R. D., & Wilhelm, F. (2021). Careless: Procedures for computing indices of careless responding. R package version 1.2.1. https://cran.r-project.org/web/packages/careless
  • Zhang, C., & Conrad, F. G. (2014). Speeding in web surveys: The tendency to answer very fast and its association with straightlining. Survey Research Methods, 8(2), 127–135. https://doi.org/10.18148/srm/2014.v8i2.5453

Details

Primary Language: English
Subjects: Field Education (Other)
Section: Research Article
Authors

Başak Erdem Kara (ORCID: 0000-0003-3066-2892)

Publication Date: July 31, 2024
Acceptance Date: July 11, 2024
Published Issue: Year 2024, Volume: 32, Issue: 3

Cite

APA Erdem Kara, B. (2024). Identifying the Careless Responders: A Cross-Country Comparison on PISA 2018 Dataset. Kastamonu Education Journal, 32(3), 396-407. https://doi.org/10.24106/kefdergi.1525320
