Review Article

Item Characteristics of National Examination Council’s Economics Multiple-Choice Items: An Item Response Theory Exploration

Year 2024, Volume: 2 Issue: 1, 8 - 18, 29.04.2024

Abstract

The study explored the IRT parameter estimates of Economics multiple-choice items using the 1-, 2- and 3-parameter logistic (PL) models. The study adopted an explorative research design with a sample of 1500 senior secondary school III Economics students selected using a multi-stage sampling procedure. The Economics Achievement Tests (EAT) and students' responses as captured on optical mark reader (OMR) sheets served as the instruments. The data collected were coded and analysed using the mirt package in the R statistical software for item parameter calibration. The results showed that, according to the discrimination indices estimated with the 2PL and 3PL models, 28 and 25 items respectively were poor items, while 32 and 35 items respectively were good items. Likewise, the difficulty indices estimated with the 1PL, 2PL and 3PL models showed that 23, 25 and 35 items respectively were easy, 35, 33 and 23 items respectively were moderately difficult, and 2 items were difficult. Furthermore, the results of the 3PL model showed that only 9 items were vulnerable to guessing, while 51 items were not. The study concluded that, based on the IRT psychometric estimates, the NECO Economics multiple-choice items were moderately difficult with average discrimination indices, and that most items were not vulnerable to guessing. It is therefore recommended that test experts and examination bodies regularly use IRT psychometric estimation to evaluate item parameters as a quality check on test items.
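The item parameters reported above come from the 1-, 2- and 3-parameter logistic models. As a minimal sketch of what those models compute (not the study's actual mirt code; the parameter values below are illustrative only), the 3PL item characteristic curve, with the 2PL as the special case of no guessing floor, can be written as:

```python
import math

def p_3pl(theta: float, a: float, b: float, c: float) -> float:
    """Probability of a correct response under the 3PL model:
    P(theta) = c + (1 - c) / (1 + exp(-a * (theta - b))),
    where a is discrimination, b is difficulty, and c is the
    pseudo-guessing (lower-asymptote) parameter."""
    return c + (1.0 - c) / (1.0 + math.exp(-a * (theta - b)))

def p_2pl(theta: float, a: float, b: float) -> float:
    """2PL is the 3PL with c = 0; the 1PL further fixes a across items."""
    return p_3pl(theta, a, b, 0.0)

# Illustrative item: moderate difficulty (b = 0.5), good discrimination
# (a = 1.2), guessing floor c = 0.2 (e.g. a five-option item).
# At theta = b the curve passes through c + (1 - c) / 2.
print(round(p_3pl(0.5, 1.2, 0.5, 0.2), 3))  # -> 0.6
```

In mirt (and most IRT software) these curves are fit to the full response matrix by marginal maximum likelihood; the sketch only shows how the calibrated a, b and c values map onto response probabilities.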

References

  • Adebowale, O. F. (2007). A study of Differential item functioning (DIF) in physics examinations in selected secondary schools in Lagos State [Master thesis]. Obafemi Awolowo University, Ile-Ife, Nigeria.
  • Adedoyin, O. O., & Mokobi, T. (2013). Using IRT psychometric analysis in examining the quality of junior certificate mathematics multiple choice examination test items. International Journal of Asian Social Science, 3(4), 992-1011.
  • Adetutu, O. M., & Lawal, H. B. (2023). Applications of item response theory models to assess item properties and students’ abilities in dichotomous responses items. Open Journal of Educational Development (OJED), 3(1), 1-19.
  • Ajeigbe, T. O., & Oderinde, O. I. (2021). Assessing unidimensionality and item parameter estimates of four different paper types of English language multiple-choice tests using three-parameter model. African Journal of Theory and Practice of Educational Assessment (AJTPEA), 10(1), 1-18.
  • Baker, F. B., & Kim, S. H. (2004). Item response theory: parameter estimation techniques (2nd ed.). Taylor and Francis.
  • Birnbaum, A. (1968). Some latent trait models and their use in inferring an examinee’s ability. In F.M. Lord & M.R. Novick (Eds.), Statistical theories of mental test scores (pp. 397-472). Addison-Wesley.
  • De Ayala, R. J. (2009). The theory and practice of item response theory. The Guilford Press.
  • DeMars, C. (2010). Item response theory. Oxford University Press.
  • Doan, C. H., Le, V. A., & Pham, U. H. (2016). Applying three-parameter logistic model in validating the level of difficulty, discrimination and guessing of items in a multiple-choice test. Ho Chi Minh City University of Education Journal of Science, 7(8), 174-184.
  • Ebel, R. L., & Frisbie, D. A. (1991). Essentials of educational measurement (5th ed.). Prentice Hall.
  • Embretson, S., & Reise, S. P. (2000). Item response theory for psychologists. Lawrence Erlbaum Associates.
  • Gierl, M. J., Bisanz, J., Bisanz, G. L., Boughton, K. A., & Khaliq, S. N. (2001). Illustrating the utility of differential bundle functioning analyses to identify and interpret group differences on achievement tests. Educational Measurement: Issues and Practice, 20, 26-36.
  • Hambleton, R. K., Swaminathan, H., & Rogers, H. J. (1991). Fundamentals of item response theory. Sage.
  • Hambleton, R. K., & Swaminathan, H. (1985). Item response theory: Principles and applications. Kluwer Academic Publishers.
  • Henning, G. (1987). A guide to language testing: Development, evaluation, research. Newbury House Publishers.
  • Izard, J. (2005a). Overview of test construction: Quantitative research methods in educational planning. International Institute for Educational Planning/UNESCO, Paris, France. http://www.sacmeq.org and http://www.unesco.org/iiep
  • Izard, J. (2005b). Trial testing and item analysis in test construction: Quantitative research methods in educational planning. International Institute for Educational Planning/UNESCO, Paris, France. http://www.sacmeq.org and http://www.unesco.org/iiep
  • Liao, W., Ho, R., & Yen, Y. (2012). The four-parameter logistic item response theory model as a robust method of estimating ability despite aberrant responses. Social Behavior and Personality, 40(10), 1679-1694. https://doi.org/10.2224/sbp.2012.40.10.1679
  • Linden, A. (2018). Review of Tenko Raykov and George Marcoulides's: A course in item response theory and modeling with Stata. The Stata Journal: Promoting Communications on Statistics and Stata, 18(2), 485-488. https://doi.org/10.1177/1536867X1801800213
  • Loken, E., & Rulison, K. L. (2010). Estimation of a four-parameter item response theory model. The British Journal of Mathematical and Statistical Psychology, 63(3), 509-25. https://doi.org/10.1348/000711009X474502
  • Magis, D. (2013). A note on the item information function of the four-parameter logistic model. Applied Psychological Measurement, 37(4), 304-315.
  • McBride, N. L. (2001). An item response theory analysis of the scales from the international personality item pool and the neo personality inventory-revised [Doctoral dissertation]. Virginia Tech.
  • Nenty, H. J. (2015, November). Conjugal relationship between research and measurement. A keynote address delivered at 1st EARNIA conference in Cameroon.
  • Oguguo, B. C. E., & Lotobi, R. A. (2019). Parameters of basic science test items of 2011 basic education certificate examination using item response theory (IRT) approach in Delta State, Nigeria. European Journal of Educational Sciences, EJES, 6(1), 22-36. http://dx.doi.org/10.19044/ejes.v6no1a2
  • Ogunsakin, I. B., & Shogbesan, Y. O. (2018). Item response theory (IRT): A modern statistical theory for solving measurement problem in 21st century. International Journal of Scientific Research in Education (IJSRE), 11(3B), 627-635.
  • Olatunji, D. S. (2007). Effects of number of options on psychometric properties of multiple choice tests in economics [M.Ed thesis]. University of Ilorin, Ilorin.
  • Olutola, A. T. (2015). Item difficulty and discrimination indices of multiple choice biology tests. Liceo Journal of Higher Education Research, 11(1), 16-30. https://doi.org/10.7828/ljher.v11i1.890
  • R Core Team (2014). R: A language and environment for statistical computing. R Foundation for Statistical Computing, Vienna, Austria. http://www.R-project.org/
  • Raykov, T., & Marcoulides, G. A. (2018). A course in item response theory and modeling with Stata. Stata Press.
  • Rulison, K. L., & Loken, E. (2009). I've fallen and I can't get up: Can high-ability students recover from early mistakes in CAT? Applied Psychological Measurement, 33(2), 83-101. https://doi.org/10.1177/0146621608324023
  • Shamshad, B., & Siddiqui, J. S. (2020). Testing procedure for item response probabilities of 2-class latent model. Mehran University Research Journal of Engineering and Technology, 39(3), 657-667. https://doi.org/10.22581/muet1982.2003.20
  • Shogbesan, Y. O. (2017). Effect of test facets on the construct validity of economics achievement tests in Osun State secondary schools [M.A.Ed. thesis]. Obafemi Awolowo University, Ile-Ife, Nigeria.
  • Shogbesan, Y. O. (2021). Sensitivity of economics multiple-choice item parameters to item compromise among secondary school students in Ogun State, Nigeria. [PhD. Thesis]. Obafemi Awolowo University, Ile-Ife, Nigeria.
  • Stout, W. (2005). DIMTEST (Version 2.0) [Computer Software]. The William Stout Institute for Measurement.
  • Thomas, M. L., Brown, G. G., Gur, R. C., Moore, T. M., Patt, V. M., Risbrough, V. B., & Baker, D. G. (2018). A signal detection–item response theory model for evaluating neuropsychological measures. Journal of Clinical and Experimental Neuropsychology, 40(8), 745-760.
  • Văn Cảnh, N. (2021). Applying the item response theory with two-parameter, three-parameter models in the evaluation of multiple choice tests. Tạp chí Khoa học Đại học Đồng Tháp, 10(4), 17-28.
  • Wang, T., & Hanson, A. (2001, April). Development of an item response model that incorporates response time [Conference presentation]. Annual meeting of the American Educational Research Association, Seattle.
  • Wiberg, M. (2004). Classical test theory vs. item response theory: An evaluation of the theory test in the Swedish driving-license test (No. 50). Kluwer Academic Publications.
There are 38 citations in total.

Details

Primary Language English
Subjects Item Response Theory
Journal Section Research Articles
Authors

Yusuf Olayinka Shogbesan (ORCID: 0000-0002-1741-1370)

Early Pub Date April 28, 2024
Publication Date April 29, 2024
Submission Date March 4, 2024
Acceptance Date April 3, 2024
Published in Issue Year 2024 Volume: 2 Issue: 1

Cite

APA Shogbesan, Y. O. (2024). Item Characteristics of National Examination Council’s Economics Multiple-Choice Items: An Item Response Theory Exploration. Journal of Psychometric Research, 2(1), 8-18.

Journal of Psychometric Research is licensed under a Creative Commons Attribution-NonCommercial 4.0 (CC BY-NC 4.0). 
