Research Article

An Investigation of Ordering Test Items Differently Depending on Their Difficulty Level by Differential Item Functioning

Volume: 17 Number: 72 November 20, 2017

Abstract

Purpose: Position effects may influence examinees’ test performance in several ways and can trigger other psychometric issues, such as Differential Item Functioning (DIF). This study prepares test forms in which the items are ordered differently depending on their difficulty level (from easy to difficult, or from difficult to easy) to determine whether the items in these forms exhibit DIF and whether the DIF-detection methods agree with one another. Research Methods: The Mantel-Haenszel (MH) and Logistic Regression (LR) methods were used to identify whether the test items exhibit DIF. The data consist of the answers of 300 students in the focal group and the reference group, who took three mathematics achievement tests. The data obtained from the tests were analyzed with the R 3.2.0 software. Findings: Ordering the items differently depending on their difficulty level affects the probability that individuals in different groups answer the items correctly. In addition, the LR and MH methods flag different items as showing DIF, although they rank the flagged items similarly in terms of the magnitude of DIF. Implications for Research and Practice: Further test-development studies should examine whether DIF emerges when test forms order items differently with respect to subject matter and cognitive difficulty level.
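The abstract names the Mantel-Haenszel procedure as one of the two DIF-detection methods. As a rough illustration of what that procedure computes, the sketch below implements the standard MH common odds ratio and chi-square (with continuity correction) over score-stratified 2x2 tables. This is a minimal sketch of the general technique, not the authors' analysis (which was run in R); the function name and the example counts are invented for illustration.

```python
# Minimal sketch of the Mantel-Haenszel DIF statistic.
# Examinees are assumed to be already stratified by total test score;
# each stratum contributes a 2x2 table of group membership vs. item response.

def mantel_haenszel(tables):
    """tables: list of (A, B, C, D) tuples, one per score stratum, where
    A = reference group correct, B = reference group incorrect,
    C = focal group correct,     D = focal group incorrect.
    Returns (common odds ratio alpha_MH, MH chi-square with continuity correction).
    """
    num = den = 0.0               # accumulators for the common odds ratio
    a_sum = e_sum = v_sum = 0.0   # accumulators for the chi-square statistic
    for A, B, C, D in tables:
        T = A + B + C + D
        if T <= 1:
            continue              # stratum too small to contribute
        num += A * D / T
        den += B * C / T
        n_ref, n_foc = A + B, C + D   # group sizes in this stratum
        m1, m0 = A + C, B + D         # correct / incorrect totals
        a_sum += A
        e_sum += n_ref * m1 / T       # expected reference-correct count
        v_sum += n_ref * n_foc * m1 * m0 / (T * T * (T - 1))
    alpha = num / den
    chi2 = (abs(a_sum - e_sum) - 0.5) ** 2 / v_sum
    return alpha, chi2

# Two strata with identical group performance: no DIF is signaled.
alpha, chi2 = mantel_haenszel([(30, 10, 30, 10), (20, 20, 20, 20)])
print(round(alpha, 3), round(chi2, 3))  # alpha = 1.0, chi2 ~ 0.028
```

An alpha_MH near 1 (equal odds of a correct answer across groups at the same ability level) indicates no DIF; values far from 1, together with a significant chi-square, flag the item.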


Details

Primary Language

English

Subjects

-

Journal Section

Research Article

Publication Date

November 20, 2017

Submission Date

November 20, 2017

Acceptance Date

-

Published in Issue

Year 2017 Volume: 17 Number: 72

APA
Balta, E., & Sunbul, S. O. (2017). An Investigation of Ordering Test Items Differently Depending on Their Difficulty Level by Differential Item Functioning. Eurasian Journal of Educational Research, 17(72), 23-42. https://izlik.org/JA63YC93HX
AMA
1. Balta E, Sunbul SO. An Investigation of Ordering Test Items Differently Depending on Their Difficulty Level by Differential Item Functioning. Eurasian Journal of Educational Research. 2017;17(72):23-42. https://izlik.org/JA63YC93HX
Chicago
Balta, Ebru, and Secil Omur Sunbul. 2017. “An Investigation of Ordering Test Items Differently Depending on Their Difficulty Level by Differential Item Functioning”. Eurasian Journal of Educational Research 17 (72): 23-42. https://izlik.org/JA63YC93HX.
EndNote
Balta E, Sunbul SO (November 1, 2017) An Investigation of Ordering Test Items Differently Depending on Their Difficulty Level by Differential Item Functioning. Eurasian Journal of Educational Research 17 72 23–42.
IEEE
[1] E. Balta and S. O. Sunbul, “An Investigation of Ordering Test Items Differently Depending on Their Difficulty Level by Differential Item Functioning”, Eurasian Journal of Educational Research, vol. 17, no. 72, pp. 23–42, Nov. 2017, [Online]. Available: https://izlik.org/JA63YC93HX
ISNAD
Balta, Ebru - Sunbul, Secil Omur. “An Investigation of Ordering Test Items Differently Depending on Their Difficulty Level by Differential Item Functioning”. Eurasian Journal of Educational Research 17/72 (November 1, 2017): 23-42. https://izlik.org/JA63YC93HX.
JAMA
1. Balta E, Sunbul SO. An Investigation of Ordering Test Items Differently Depending on Their Difficulty Level by Differential Item Functioning. Eurasian Journal of Educational Research. 2017;17:23–42.
MLA
Balta, Ebru, and Secil Omur Sunbul. “An Investigation of Ordering Test Items Differently Depending on Their Difficulty Level by Differential Item Functioning”. Eurasian Journal of Educational Research, vol. 17, no. 72, Nov. 2017, pp. 23-42, https://izlik.org/JA63YC93HX.
Vancouver
1. Ebru Balta, Secil Omur Sunbul. An Investigation of Ordering Test Items Differently Depending on Their Difficulty Level by Differential Item Functioning. Eurasian Journal of Educational Research [Internet]. 2017 Nov. 1;17(72):23-42. Available from: https://izlik.org/JA63YC93HX