Research Article

An Assessment of Item Statistics Estimates of Basic Education Certificate Examination through Classical Test Theory and Item Response Theory approach

Year 2018, Volume: 3 Issue: 4, 55 - 67, 01.10.2018
https://doi.org/10.24331/ijere.452555

Abstract

In this study, the researchers assessed the comparability of
item statistics of the 2017 Basic Education Certificate mathematics examination of the
National Examinations Council (NECO) under the Classical Test Theory (CTT) and
Item Response Theory (IRT) measurement frameworks. The study adopted an
instrumentation design. The 60-item NECO Basic Education Certificate
mathematics objective test (Paper I) was administered to 978 Basic 9
examinees randomly selected from Osogbo and Olorunda Local Government Areas,
Osun State, Nigeria. The examinees' responses were
analysed using Marginal Maximum Likelihood Estimation in the jMetrik software. The results showed that the test data satisfied the
unidimensionality assumption of the 3-parameter logistic model, and that the Classical
Test Theory framework flagged more items for deletion, 33 of the 60 (55%), than the IRT
framework, 12 (20%). It was also observed that item statistics from
the two contrasting frameworks (CTT and IRT) were not comparable, and
further analysis showed low correlations among the item
statistics indices. The implication is that NECO should jettison
Classical Test Theory and embrace the Item Response Theory
framework in its test development and item analysis.
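Since the abstract's comparison hinges on two sets of item statistics, a minimal sketch of how such a comparison is computed may help: CTT difficulty as the proportion correct, CTT discrimination as a corrected point-biserial, and the 3-parameter logistic (3PL) item characteristic curve whose b-parameter plays the IRT difficulty role. Everything below is illustrative and assumed, not the authors' code: the response matrix and b-parameters are simulated stand-ins, whereas the study calibrated real NECO responses with jMetrik's Marginal Maximum Likelihood Estimation.

import numpy as np
from scipy.stats import pearsonr, spearmanr

# Hypothetical scored data: 978 examinees x 60 dichotomous (0/1) items,
# simulated here; the study's real data were NECO BECE mathematics Paper I.
rng = np.random.default_rng(42)
responses = rng.integers(0, 2, size=(978, 60))

# --- CTT item statistics ---
# Difficulty (p-value): proportion of examinees answering the item correctly.
p_values = responses.mean(axis=0)

# Discrimination: corrected point-biserial (item vs. rest-of-test score).
rest_scores = responses.sum(axis=1, keepdims=True) - responses
discrimination = np.array([
    pearsonr(responses[:, i], rest_scores[:, i])[0]
    for i in range(responses.shape[1])
])

# --- 3PL item characteristic curve ---
# P(theta) = c + (1 - c) / (1 + exp(-a * (theta - b)))
# a: discrimination, b: difficulty, c: pseudo-guessing.
def icc_3pl(theta, a, b, c):
    return c + (1.0 - c) / (1.0 + np.exp(-a * (theta - b)))

print(f"P(theta=0 | a=1.2, b=0.5, c=0.2) = {icc_3pl(0.0, 1.2, 0.5, 0.2):.3f}")

# --- Comparability check in the spirit of the study ---
# Correlate CTT difficulty with 3PL b-parameters (simulated stand-ins here;
# the authors estimated theirs in jMetrik).
b_params = rng.normal(0.0, 1.0, size=60)
rho, _ = spearmanr(p_values, b_params)
print(f"Spearman rho (CTT p-value vs. IRT b): {rho:.3f}")

Because CTT p-values fall as items get harder while 3PL b-values rise, comparability in the study's sense would appear as a strong negative rank correlation; the low correlations the authors report indicate the two frameworks order items differently.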


Details

Primary Language English
Journal Section Articles
Authors

Musa Adekunle Ayanwale

Joshua Oluwatoyin Adeleke

Titilayo Iyabode Mamadelo

Publication Date October 1, 2018
Published in Issue Year 2018 Volume: 3 Issue: 4

Cite

APA Adekunle Ayanwale, M., Oluwatoyin Adeleke, J., & Iyabode Mamadelo, T. (2018). An Assessment of Item Statistics Estimates of Basic Education Certificate Examination through Classical Test Theory and Item Response Theory approach. International Journal of Educational Research Review, 3(4), 55-67. https://doi.org/10.24331/ijere.452555
AMA Adekunle Ayanwale M, Oluwatoyin Adeleke J, Iyabode Mamadelo T. An Assessment of Item Statistics Estimates of Basic Education Certificate Examination through Classical Test Theory and Item Response Theory approach. IJERE. October 2018;3(4):55-67. doi:10.24331/ijere.452555
Chicago Adekunle Ayanwale, Musa, Joshua Oluwatoyin Adeleke, and Titilayo Iyabode Mamadelo. “An Assessment of Item Statistics Estimates of Basic Education Certificate Examination through Classical Test Theory and Item Response Theory Approach”. International Journal of Educational Research Review 3, no. 4 (October 2018): 55-67. https://doi.org/10.24331/ijere.452555.
EndNote Adekunle Ayanwale M, Oluwatoyin Adeleke J, Iyabode Mamadelo T (October 1, 2018) An Assessment of Item Statistics Estimates of Basic Education Certificate Examination through Classical Test Theory and Item Response Theory approach. International Journal of Educational Research Review 3 4 55–67.
IEEE M. Adekunle Ayanwale, J. Oluwatoyin Adeleke, and T. Iyabode Mamadelo, “An Assessment of Item Statistics Estimates of Basic Education Certificate Examination through Classical Test Theory and Item Response Theory approach”, IJERE, vol. 3, no. 4, pp. 55–67, 2018, doi: 10.24331/ijere.452555.
ISNAD Adekunle Ayanwale, Musa et al. “An Assessment of Item Statistics Estimates of Basic Education Certificate Examination through Classical Test Theory and Item Response Theory Approach”. International Journal of Educational Research Review 3/4 (October 2018), 55-67. https://doi.org/10.24331/ijere.452555.
JAMA Adekunle Ayanwale M, Oluwatoyin Adeleke J, Iyabode Mamadelo T. An Assessment of Item Statistics Estimates of Basic Education Certificate Examination through Classical Test Theory and Item Response Theory approach. IJERE. 2018;3:55–67.
MLA Adekunle Ayanwale, Musa et al. “An Assessment of Item Statistics Estimates of Basic Education Certificate Examination through Classical Test Theory and Item Response Theory Approach”. International Journal of Educational Research Review, vol. 3, no. 4, 2018, pp. 55-67, doi:10.24331/ijere.452555.
Vancouver Adekunle Ayanwale M, Oluwatoyin Adeleke J, Iyabode Mamadelo T. An Assessment of Item Statistics Estimates of Basic Education Certificate Examination through Classical Test Theory and Item Response Theory approach. IJERE. 2018;3(4):55-67.

International Journal of Educational Research Review (ISSN:2458-9322)
International Journal of Educational Research Review publishes scholarly articles that are of general significance to the education research community and that come from a wide range of areas. The journal aims to provide a forum for scholarly understanding of the field of education. Articles focus on concepts, research, reviews, and practices that emphasize the intersections between education research and other fields, raise new questions, and develop innovative approaches to our understanding of pressing issues.
For the full range of topics covered in the International Journal of Educational Research Review, read the aims and scope: https://www.ijere.com/page/journal-scopes
Peer Review Policy: All articles in this journal will undergo initial editor screening, rigorous double-anonymous peer review, and review by the editorial board.
Double-blind peer review
International Journal of Educational Research Review follows a double-blind reviewing procedure. This means that the author will remain anonymous to the reviewers throughout peer review. It is the responsibility of the author to anonymize the manuscript and any associated materials. Read the full Peer Review Policy: https://www.ijere.com/page/peer-review-policy
Ownership and management
The IJERE journal is hosted by DergiPark/TÜBİTAK and published with the support of the Sakarya University Faculty of Education, Turkey.
Governing body
Editor in Chief
Dr. Serhat Arslan, Gazi University, serhatarslan@gazi.edu.tr, Turkey
Read the full governing body, ownership and management, and editorial board details: https://www.ijere.com/page/editorial-board
Copyright and licensing
Copyright Statement
Copyright violation is an important ethical issue. Authors should check their manuscripts for possible breaches of copyright law (e.g., where permissions are needed for quotations, artwork, or tables taken from other publications or from other freely available sources on the Internet) and secure the necessary permissions before submission to International Journal of Educational Research Review.
Read the full copyright and licensing/open access statement: https://www.ijere.com/page/copyright-and-licensing-open-access-statement