Research Article

An Iterative Method for Empirically-Based Q-Matrix Validation

Year 2018, Volume: 5 Issue: 2, 248-262, 19.05.2018
https://doi.org/10.21449/ijate.407193

Abstract

In cognitive diagnosis modeling, the attributes required for each item are specified in the Q-matrix. The traditional way of constructing a Q-matrix based on expert opinion is inherently subjective, which can raise serious validity concerns. The current study proposes a new validation method under the deterministic inputs, noisy "and" gate (DINA) model to empirically validate attribute specifications in the Q-matrix. In particular, an iterative procedure with a modified version of the sequential search algorithm is introduced. Simulation studies are conducted to compare the proposed method with existing parametric and nonparametric methods. Results show that the new method outperforms the other methods across the board. Finally, the method is applied to real data from a fraction-subtraction test.
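For readers unfamiliar with the model the validation method targets, the sketch below illustrates the standard DINA item response function that a Q-matrix feeds into. This is not the authors' implementation; the function name and toy numbers are illustrative. Under the DINA model, examinee i's ideal response to item j is eta_ij = 1 only if the examinee possesses every attribute the Q-matrix specifies for that item, and the probability of a correct response is (1 - s_j)^eta_ij * g_j^(1 - eta_ij), with s_j and g_j the slip and guessing parameters.

```python
import numpy as np

def dina_prob(alpha, Q, slip, guess):
    """Correct-response probabilities under the DINA model.

    alpha : (N, K) binary attribute profiles, one row per examinee
    Q     : (J, K) binary Q-matrix, one row per item
    slip, guess : (J,) item slip and guessing parameters
    Returns an (N, J) matrix of P(X_ij = 1).
    """
    # eta[i, j] = 1 iff examinee i has every attribute item j requires
    eta = np.all(alpha[:, None, :] >= Q[None, :, :], axis=2).astype(int)
    return (1 - slip) ** eta * guess ** (1 - eta)

# Toy example: 2 attributes, 2 items, 2 examinees.
Q = np.array([[1, 0],          # item 1 requires attribute 1 only
              [1, 1]])         # item 2 requires both attributes
alpha = np.array([[1, 0],      # examinee 1 masters attribute 1 only
                  [1, 1]])     # examinee 2 masters both
slip = np.array([0.1, 0.2])
guess = np.array([0.2, 0.1])
print(dina_prob(alpha, Q, slip, guess))
# -> [[0.9 0.1]
#     [0.9 0.8]]
```

A misspecified Q-matrix entry changes eta for some examinee classes, which is what empirical validation methods such as the one proposed here exploit: the correct specification should yield the best fit between predicted and observed response probabilities.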

References

  • Akbay, L., Terzi, R., Kaplan, M., & Karaaslan, K. G. (2018). Expert-based attribute identification and validation: An application of cognitively diagnostic assessment. Journal on Mathematics Education, 9, 103-120.
  • Chiu, C.-Y. (2013). Statistical refinement of the Q-matrix in cognitive diagnosis. Applied Psychological Measurement, 37, 598-618.
  • Chiu, C.-Y., & Douglas, J. (2013). A nonparametric approach to cognitive diagnosis by proximity to ideal response patterns. Journal of Classification, 30, 225-250.
  • Chiu, C.-Y., Douglas, J., & Li, X. (2009). Cluster analysis for cognitive diagnosis: Theory and applications. Psychometrika, 74, 633-665.
  • Chiu, C.-Y., & Köhn, H.-F. (2015). Consistency of cluster analysis for cognitive diagnosis: The DINO model and the DINA model revisited. Applied Psychological Measurement, 39, 465-479.
  • DeCarlo, L. T. (2011). On the analysis of fraction subtraction data: The DINA model, classification, latent class sizes, and the Q-matrix. Applied Psychological Measurement, 35(1), 8-26.
  • de la Torre, J. (2008). An empirically based method of Q-matrix validation for the DINA model: Development and applications. Journal of Educational Measurement, 45, 343-362.
  • de la Torre, J. (2009a). A cognitive diagnosis model for cognitively based multiple-choice options. Applied Psychological Measurement, 33, 163-183.
  • de la Torre, J. (2009b). DINA model and parameter estimation: A didactic. Journal of Educational and Behavioral Statistics, 34, 115-130.
  • de la Torre, J. (2011). The generalized DINA model framework. Psychometrika, 76, 179-199.
  • de la Torre, J., & Chiu, C.-Y. (2016). A general method of empirical Q-matrix validation. Psychometrika, 81, 253-273.
  • de la Torre, J., & Douglas, J. A. (2004). Higher-order latent trait models for cognitive diagnosis. Psychometrika, 69, 333-353.
  • de la Torre, J., & Douglas, J. A. (2008). Model evaluation and multiple strategies in cognitive diagnosis: An analysis of fraction subtraction data. Psychometrika, 73, 595-624.
  • Doornik, J. A. (2009). An object-oriented matrix programming language Ox 6. [Computer software]. London, UK: Timberlake Consultants Ltd.
  • Haertel, E. H. (1989). Using restricted latent class models to map the skill structure of achievement items. Journal of Educational Measurement, 26, 333–352.
  • Hou, L., de la Torre, J., & Nandakumar, R. (2014). Differential item functioning assessment in cognitive diagnostic modeling: Application of the Wald test to investigate DIF in the DINA model. Journal of Educational Measurement, 51, 98-125.
  • Huo, Y., & de la Torre, J. (2014). Estimating a cognitive diagnostic model for multiple strategies via the EM algorithm. Applied Psychological Measurement, 38, 464-485.
  • Junker, B. W., & Sijtsma, K. (2001). Cognitive assessment models with few assumptions, and connections with nonparametric item response theory. Applied Psychological Measurement, 25, 258-272.
  • Kuo, B.-C., Pai, H.-S., & de la Torre, J. (2016). Modified cognitive diagnostic index and modified attribute-level discrimination index for test construction. Applied Psychological Measurement, 40, 315-330.
  • Liu, J., Xu, G., & Ying, Z. (2012). Data-driven learning of Q-matrix. Applied Psychological Measurement, 36, 609-618.
  • Liu, J., Ying, Z., & Zhang, S. (2015). A rate function approach to computerized adaptive testing for cognitive diagnosis. Psychometrika, 80, 468-490.
  • Mislevy, R. J. (1996). Test theory reconceived. Journal of Educational Measurement, 33, 379-416.
  • Park, Y. S., & Lee, Y.-S. (2014). An extension of the DINA model using covariates: Examining factors affecting response probability and latent classification. Applied Psychological Measurement, 38, 376-390.
  • R Core Team. (2014). R: A language and environment for statistical computing [Computer software manual]. Vienna, Austria. Retrieved from http://www.R-project.org/
  • Rojas, G., de la Torre, J., & Olea, J. (2012, April). Choosing between general and specific cognitive diagnosis models when the sample size is small. Paper presented at the annual meeting of the National Council of Measurement in Education, Vancouver, British Columbia, Canada.
  • Rupp, A., & Templin, J. (2008). Effects of Q-matrix misspecification on parameter estimates and misclassification rates in the DINA model. Educational and Psychological Measurement, 68, 78-98.
  • Tatsuoka, K. K. (1983). Rule space: An approach for dealing with misconceptions based on item response theory. Journal of Educational Measurement, 20, 345-354.
  • Tatsuoka, K. K. (1984). Analysis of errors in fraction addition and subtraction problems (Report No. NIE-G-81-0002). Urbana: Computer-based Education Research Laboratory, University of Illinois.
  • Terzi, R. (2017). New Q-matrix validation procedures. Retrieved from https://doi.org/10.7282/T3571G5G
  • Zheng, Y., & Chiu, C.-Y. (2015). NPCD: The R package for nonparametric methods for cognitive diagnosis.

There are 30 citations in total.

Details

Primary Language English
Subjects Studies on Education
Journal Section Articles
Authors

Ragip Terzi

Jimmy De La Torre

Publication Date May 19, 2018
Submission Date February 2, 2018
Published in Issue Year 2018

Cite

APA Terzi, R., & De La Torre, J. (2018). An Iterative Method for Empirically-Based Q-Matrix Validation. International Journal of Assessment Tools in Education, 5(2), 248-262. https://doi.org/10.21449/ijate.407193
