Research Article

A novel approach for calculating the item discrimination for Likert type of scales

Year 2022, Volume: 9 Issue: 3, 772 - 786, 30.09.2022
https://doi.org/10.21449/ijate.1173356

Abstract

Item analysis is an integral part of the scale development process; based on its results, developers exclude items from the scale before conducting factor analysis. Existing item discrimination indices are calculated from correlations, so items with quite different response patterns can receive similar discrimination values. This study proposed a new item discrimination index for Likert-type scales and examined its effect on factor analysis results. For this purpose, simulated datasets were generated, items were excluded according to the .20, .30, and .35 discrimination criteria, and a single-factor exploratory factor analysis was performed. When the .20 criterion was applied to the slope coefficient proposed in this study, a single factor explained more variance with fewer items than when the existing discrimination indices were used; comparable results with those indices required the .35 criterion. Accordingly, the slope coefficient is recommended as an additional item discrimination index in the scale development process.
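
The workflow described in the abstract can be illustrated with a brief R sketch, shown below under stated assumptions: it simulates single-factor, 5-point Likert data, computes a conventional corrected item-total correlation alongside a regression-based "slope" index (a stand-in for illustration only; the article defines its own slope coefficient), applies the .20 cutoff, and fits a single-factor exploratory factor analysis with the psych package cited in the references. The sample size, item count, and rescaling used here are assumptions of the sketch, not values taken from the study.

```r
library(psych)  # fa() for the single-factor EFA

set.seed(2022)
n_persons <- 500   # assumed sample size (not from the article)
n_items   <- 10    # assumed number of items (not from the article)

# Simulate 5-point Likert responses driven by a single latent trait;
# item quality varies through the loading.
theta    <- rnorm(n_persons)
loadings <- runif(n_items, 0.2, 0.9)
raw      <- sapply(loadings, function(l) l * theta + rnorm(n_persons, sd = sqrt(1 - l^2)))
likert   <- apply(raw, 2, function(x) as.numeric(cut(x, breaks = 5)))  # categories 1..5
colnames(likert) <- paste0("item", seq_len(n_items))

rescale01  <- function(x) (x - min(x)) / (max(x) - min(x))
rest_score <- function(i, data) rowSums(data[, -i, drop = FALSE])

item_stats <- data.frame(
  item  = seq_len(n_items),
  # conventional corrected item-total (item-rest) correlation
  r_it  = sapply(seq_len(n_items), function(i) cor(likert[, i], rest_score(i, likert))),
  # illustrative "slope" index: regression slope of the rescaled item on the
  # rescaled rest score -- NOT necessarily the article's exact definition
  slope = sapply(seq_len(n_items), function(i)
    unname(coef(lm(rescale01(likert[, i]) ~ rescale01(rest_score(i, likert))))[2]))
)

keep <- item_stats$slope >= .20           # the .20 criterion examined in the study
efa  <- fa(likert[, keep], nfactors = 1)  # single-factor exploratory factor analysis

print(item_stats)
print(efa$Vaccounted)                     # variance explained by the single factor
```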

References

  • Beuchert, A.K., & Mendoza, J.L. (1979). A Monte Carlo comparison of ten item discrimination indices. Journal of Educational Measurement, 16(2), 109-117. http://www.jstor.org/stable/1434454
  • Crocker, L., & Algina, J. (2008). Introduction to Classical and Modern Test Theory. Cengage Learning.
  • Cum, S. (2021). Examining the discrimination of binary scored test items with ROC analysis. International Journal of Assessment Tools in Education, 8(4), 948-958. https://doi.org/10.21449/ijate.894851
  • Engelhart, M.D. (1965). A comparison of several item discrimination indices. Journal of Educational Measurement, 2, 69-76. https://doi.org/10.1111/j.1745-3984.1965.tb00393.x
  • Johnson, A.P. (1951). Notes on a suggested index of item validity: The U-L index. Journal of Educational Psychology, 42(8), 499-504. https://doi.org/10.1037/h0060855
  • Kelley, T.L. (1939). The selection of upper and lower groups for the validation of test items. Journal of Educational Psychology, 30(1), 17-24. https://doi.org/10.1037/h0057123
  • Livingston, S.A., & Dorans, N.J. (2004). A Graphical Approach to Item Analysis. ETS Research Report.
  • Magis, D., & Raiche, G. (2012). Random generation of response patterns under computerized adaptive testing with the R package catR. Journal of Statistical Software, 48(8), 1-31. https://doi.org/10.18637/jss.v048.i08
  • Martinkova, P., & Drabinova, A. (2018). ShinyItemAnalysis for teaching psychometrics and to enforce routine analysis of educational tests. The R Journal, 10(2), 503-515. https://doi.org/10.32614/RJ-2018-074
  • Moses, T. (2017). A review of developments and applications in item analysis. In R. E. Bennett & M. von Davier (Eds.), Advancing Human Assessment. Springer Open.
  • Oosterhof, A.C. (1976). Similarity of various item discrimination indices. Journal of Educational Measurement, 13(2), 145-150. http://www.jstor.org/stable/1434235
  • Popham, W.J. (2014). Classroom Assessment: What Teachers Need to Know. Pearson.
  • R Core Team (2022). R: A language and environment for statistical computing. R Foundation for Statistical Computing, Vienna, Austria. URL: https://www.R-project.org/
  • Revelle, W. (2022). psych: Procedures for Personality and Psychological Research. Northwestern University, Evanston, Illinois, USA. URL: https://CRAN.R-project.org/package=psych
  • Tabachnick, B.G., & Fidell, L.S. (2013). Using Multivariate Statistics. Pearson.
  • Vonglao, P. (2017). Application of fuzzy logic to improve the Likert scale to measure latent variables. Kasetsart Journal of Social Sciences, 38(3), 337-344. https://doi.org/10.1016/j.kjss.2017.01.002
  • Wickham, H. (2016). ggplot2: Elegant Graphics for Data Analysis. Springer-Verlag.
  • Wickham, H., Francois, R., Henry, L., & Müller, K. (2022). dplyr: A Grammar of Data Manipulation. R package version 1.0.9. URL: https://CRAN.R-project.org/package=dplyr

Details

Primary Language: English
Subjects: Other Fields of Education
Journal Section: Articles
Authors

Ümit Çelen 0000-0001-6376-6167

Eren Can Aybek 0000-0003-3040-2337

Early Pub Date: August 31, 2022
Publication Date: September 30, 2022
Submission Date: May 21, 2022
Published in Issue: Year 2022, Volume: 9, Issue: 3

Cite

APA Çelen, Ü., & Aybek, E. C. (2022). A novel approach for calculating the item discrimination for Likert type of scales. International Journal of Assessment Tools in Education, 9(3), 772-786. https://doi.org/10.21449/ijate.1173356
