Research Article

An Application of Cognitive Diagnosis Modeling in TIMSS: A Comparison of Intuitive Definitions of Q-Matrices

Year 2019, Volume: 3 Issue: 1, 4 - 17, 30.06.2019

Abstract


Detecting students’ ability levels is a common aim in educational studies. The Cognitive Diagnosis Modeling (CDM) approach has recently been used for this purpose through defined Q-matrices. CDM can help evaluate students’ strengths and weaknesses, determine which skills they have mastered, and design instruction and interventions in the learning process. As an alternative to Item Response Theory, CDM provides more information by modeling the multiple fine-grained skills involved in the problem-solving process rather than ordering students on a single latent proficiency continuum. This paper uses CDM to investigate the definition of a Q-matrix across the cognitive skills of different years and countries in the Trends in International Mathematics and Science Study (TIMSS). Because defining a Q-matrix is partly subjective, an intuitive definition of Q-matrices is adopted: an application of building Q-matrices under specific Cognitive Diagnosis Models from a set of expert-proposed attributes is examined. The proposed attributes are used to build Q-matrices for TIMSS mathematics questions across its cycles and across different nations.
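
To make the Q-matrix idea above concrete, the sketch below (in Python, assuming NumPy is available) shows how a set of expert-proposed attributes can be encoded as a Q-matrix and how the DINA model cited in the references (De la Torre, 2009) condenses a student’s attribute-mastery pattern into item responses. The attribute labels, item rows, and slip/guess values are hypothetical illustrations, not the actual Q-matrices or parameters used in this study.

```python
# Minimal sketch of an intuitively defined Q-matrix and the DINA ideal-response
# rule (De la Torre, 2009). All attribute labels, item rows, and parameter
# values below are hypothetical illustrations, not taken from the paper.
import numpy as np

# Hypothetical expert-proposed attributes for a handful of TIMSS-style items.
attributes = ["whole numbers", "fractions", "algebraic expressions", "data reading"]

# Q-matrix: rows = items, columns = attributes; q_jk = 1 if item j requires attribute k.
Q = np.array([
    [1, 0, 0, 0],   # item 1: whole-number computation only
    [1, 1, 0, 0],   # item 2: whole numbers and fractions
    [0, 0, 1, 0],   # item 3: algebraic expressions
    [0, 1, 0, 1],   # item 4: fractions applied to a data display
])

# A student's latent attribute-mastery pattern (1 = mastered).
alpha = np.array([1, 1, 0, 1])

# DINA ideal response: eta_j = 1 only if the student has mastered every
# attribute the Q-matrix assigns to item j (conjunctive condensation rule).
eta = np.all(alpha >= Q, axis=1).astype(int)

# Observed responses deviate from eta through each item's slip (s_j) and
# guess (g_j) parameters: P(X_j = 1 | alpha) = (1 - s_j)^eta_j * g_j^(1 - eta_j).
s = np.array([0.10, 0.20, 0.15, 0.10])   # illustrative slip parameters
g = np.array([0.20, 0.10, 0.25, 0.20])   # illustrative guess parameters
p_correct = (1 - s) ** eta * g ** (1 - eta)

print("Ideal responses:", eta)            # [1 1 0 1]
print("P(correct):", p_correct.round(2))  # [0.9  0.8  0.25 0.9 ]
```

Under this conjunctive rule, an item is answered correctly (up to slip and guess noise) only when every attribute the Q-matrix assigns to it has been mastered, which is why the choice of Q-matrix entries directly shapes the diagnostic profiles the model can report.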

References

  • American Federation of Teachers (AFT). (1999). What TIMSS tells us about mathematics achievement, curriculum, and instruction. Educational Issues Policy Brief, 10, 2-10. Washington, DC: AFT Educational Issues Department.
  • Birenbaum, M., Nasser, F., & Tatsuoka, C. (2007). Effects of gender and ethnicity on fourth graders’ knowledge states in mathematics. International Journal of Mathematical Education in Science and Technology, 38, 301-319.
  • Birenbaum, M., Tatsuoka, C., & Yamada, T. (2004). Diagnostic assessment in TIMSS-R: Between countries and within-country comparisons of eighth graders’ mathematics performance. Studies in Educational Evaluation, 30, 151-173.
  • Chen, Y., Thompson, M., Gorin, J. S., & Tatsuoka, K. K. (2008). Cross-cultural validity of the TIMSS-1999 mathematics test: Verification of a cognitive model. International Journal of Testing, 8, 251-271.
  • De la Torre, J. (2008). An empirically-based method of Q-matrix validation for the DINA model: Development and applications. Journal of Educational Measurement, 45, 343-362.
  • De la Torre, J. (2009). DINA model and parameter estimation: A didactic. Journal of Educational and Behavioral Statistics, 34, 115-130.
  • De la Torre, J. (2011). The generalized DINA model framework. Psychometrika, 76, 179-199.
  • Dogan, E., & Tatsuoka, K. K. (2008). An international comparison using a diagnostic testing model: Turkish students’ profile of mathematical skills on TIMSS-R. Educational Studies in Mathematics, 68, 263-272.
  • Doornik, J. A. (2002). Object-Oriented Matrix Programming Using Ox (Version 3.1) [Computer software]. London: Timberlake Consultants Press.
  • Foy, P., & Olson, J. F. (2009). TIMSS 2007 user guide for the international database. Chestnut Hill, MA: IEA.
  • Haertel, E. H. (1989). Using restricted latent class models to map the skill structure of achievement items. Journal of Educational Measurement, 26, 333-352.
  • Hambleton, R. K. (2005). Issues, designs, and technical guidelines for adapting tests in multiple languages. In R. K. Hambleton, P. F. Merenda, & C. D. Spielberger (Eds.), Adapting educational and psychological tests for cross-cultural assessment. Hillsdale, NJ: Lawrence Erlbaum.
  • Junker, B. W., & Sijtsma, K. (2001). Cognitive assessment models with few assumptions, and connections with nonparametric item response theory. Applied Psychological Measurement, 25, 258-272.
  • Lee, Y.-S., Park, Y. S., & Taylan, D. (2011). Cognitive diagnostic modeling of attribute mastery in Massachusetts, Minnesota, and the U.S. national sample using the TIMSS 2007. International Journal of Testing, 11, 144-177.
  • Mullis, I. V. S., Martin, M. O., Foy, P., & Arora, A. (2012). TIMSS 2011 international results in mathematics. Chestnut Hill, MA: IEA.
  • Olson, J. F., Martin, M. O., & Mullis, I. V. S. (2009). TIMSS 2007 technical report. Chestnut Hill, MA: IEA.
  • Tatsuoka, K. K. (1983). Rule space: An approach for dealing with misconceptions based on item response theory. Journal of Educational Measurement, 20, 345-354.
  • Tatsuoka, K. K. (2009). Cognitive assessment: An introduction to the rule space method. New York, NY: Routledge.

Details

Primary Language English
Journal Section Research Article
Authors

Derya Evran (ORCID: 0000-0002-6960-342X)

Publication Date June 30, 2019
Published in Issue Year 2019 Volume: 3 Issue: 1

Cite

APA Evran, D. (2019). An Application of Cognitive Diagnosis Modeling in TIMSS: A Comparison of Intuitive Definitions of Q-Matrices. International Journal of Modern Education Studies, 3(1), 4-17.