Examining the Dimensionality and Monotonicity of an Attitude Dataset based on the Item Response Theory Models
Abstract
In this study, the factor structure of an attitude scale was analyzed using two item response theory models that allow modeling non-monotonic item response curves. These two models were used to examine whether the two-factor solution suggested by factor analysis might stem from a method effect, or from the failure of factor analysis to describe and fit the dataset because of its monotonicity assumption. The study was conducted on data obtained from 355 undergraduate students studying at Middle East Technical University. The data were collected by administering the Attitude Scale Towards Foreign Languages as Medium of Instruction, developed by Kartal and Gülleroğlu (2015). The fit of the scale items to the generalized graded unfolding model was examined on the basis of item response curves, item parameters, item fit statistics, and fit graphs. For Mokken scaling, scalability coefficients were calculated, and dimensionality analyses were conducted using the Automated Item Selection Procedure. The monotonicity assumption was investigated with the rest-score group method. The results revealed that the items of the attitude scale fit unidimensional models that do not assume monotonically increasing item response curves for all items, whereas factor analysis suggested a two-factor solution for the data. Researchers are therefore recommended to use statistical techniques that can identify possible violations of the monotonicity assumption and can model items with non-monotonic response curves when examining the dimensionality of their data.
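The rest-score group method mentioned in the abstract can be illustrated with a short sketch. The idea is simple: for each item, respondents are ordered by their rest score (total score minus the item in question) and split into groups; under the monotonicity assumption, the mean item score should not decrease from one group to the next. The sketch below is a minimal illustration on synthetic Likert-type data, not the authors' actual analysis (which, per the abstract, used Mokken scaling tools); the function name, group count, and data-generating model are all assumptions chosen for demonstration.

```python
import numpy as np

def monotonicity_check(data, item, n_groups=3):
    """Mean scores of one item across ordered rest-score groups.

    Under monotonicity, the means should be non-decreasing as the
    rest score (total score minus the item itself) increases.
    """
    rest = data.sum(axis=1) - data[:, item]       # rest score per respondent
    order = np.argsort(rest, kind="stable")       # sort respondents by rest score
    groups = np.array_split(order, n_groups)      # roughly equal-sized groups
    means = np.array([data[g, item].mean() for g in groups])
    violations = int(np.sum(np.diff(means) < 0))  # count decreases between groups
    return means, violations

# Hypothetical 5-point (0-4) data for 355 respondents and 10 monotone items
rng = np.random.default_rng(0)
theta = rng.normal(size=355)                      # latent attitude levels
probs = 1 / (1 + np.exp(-(theta[:, None] - np.linspace(-1, 1, 10))))
data = rng.binomial(4, probs)                     # expected score rises with theta
means, violations = monotonicity_check(data, item=0)
print(means, violations)                          # means should rise; 0 violations
```

For a genuinely non-monotonic (e.g., unfolding) item, the group means would peak and then fall, producing a positive violation count; this is the kind of pattern that motivates models such as the generalized graded unfolding model.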
Details
Primary Language
English
Subjects
Studies on Education
Journal Section
Research Article
Publication Date
June 10, 2021
Submission Date
April 28, 2020
Acceptance Date
March 12, 2021
Published in Issue
Year 2021 Volume: 8 Number: 2
APA
Kartal, S., & Mor Dirlik, E. (2021). Examining the Dimensionality and Monotonicity of an Attitude Dataset based on the Item Response Theory Models. International Journal of Assessment Tools in Education, 8(2), 296-309. https://doi.org/10.21449/ijate.728362
Cited By
- ÖLÇEKLERDE İFADE ETKİSİNİN FARKLI BOYUTLULUK ANALİZLERİYLE İNCELENMESİ [Examining the Wording Effect in Scales with Different Dimensionality Analyses]. Uludağ Üniversitesi Eğitim Fakültesi Dergisi. https://doi.org/10.19171/uefad.1033284
- On the monotonicity of the residual heteroscedasticity item response model. Behaviormetrika. https://doi.org/10.1007/s41237-023-00212-3
- Exploring the suitability of self-reports to gain insights into teachers’ technological pedagogical knowledge. Studies in Educational Evaluation. https://doi.org/10.1016/j.stueduc.2024.101402
- Promoting a valid question model for measuring computational thinking skills based on confirmatory factor analysis and Rasch model. Cogent Education. https://doi.org/10.1080/2331186X.2025.2505339