Research Article

The Effect of Background Variables on Gender Related Differential Item Functioning

Year 2017, Volume: 8 Issue: 4, 373 - 390, 28.12.2017
https://doi.org/10.21031/epod.333451

Abstract

In this study, the effects of socioeconomic status and reading ability on the emergence of gender-related DIF were examined. For this purpose, PISA 2015 science items were tested for differential item functioning (DIF) between gender groups in nine selected countries. One cluster of science items from the computer-based assessment (CBA) was considered. The countries were selected from among those that implemented the CBA, on the basis of their rank in science achievement. The multiple indicators multiple causes (MIMIC) method was used for the DIF analyses. DIF analysis with MIMIC compares the fit of full and constrained models to determine whether the items measure the latent trait equivalently across the specified groups.
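The abstract does not reproduce the MIMIC equations; as a sketch in standard notation from the cited methodological literature (Muthén, 1985; Finch, 2005), the uniform-DIF formulation for item j with latent trait \eta and gender covariate z can be written as

\begin{align}
  y_j^{*} &= \lambda_j \eta + \beta_j z + \varepsilon_j, \\
  \eta    &= \gamma z + \zeta,
\end{align}

where y_j^{*} is the latent response underlying item j. The regression \gamma captures impact (a true group difference on the trait), while a nonzero direct effect \beta_j signals uniform DIF in item j. The constrained model fixes \beta_j = 0, so comparing its fit with the full model tests whether the item functions differentially; in the study's second step, z extends to a vector that also includes socioeconomic status and reading ability.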
The MIMIC analysis was conducted in two steps. First, the items were tested for DIF between gender groups. Then socioeconomic status and reading ability were added to the model in turn, and their effects on gender-related DIF were examined. According to the results, items with gender-related DIF appeared in all of the selected countries, with the number of flagged items ranging from two to six. In four of the countries, the added variables did not significantly affect the number of gender-related DIF items; in the remaining countries, however, adding these variables reduced the number of gender-related DIF items. The variables that reduced the number of gender-related DIF items are discussed separately for each country.
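In software output, the model comparison above reduces to a chi-square difference test. The following is a minimal Python sketch of that decision logic, not the authors' code; the fit statistics are hypothetical placeholders, and with the categorical-item estimators typically used for such data (e.g., WLSMV in Mplus) the difference test must instead use a scaled correction such as DIFFTEST rather than simple subtraction.

from scipy.stats import chi2

def dif_lr_test(chisq_constrained, df_constrained, chisq_full, df_full, alpha=0.05):
    """Compare a constrained MIMIC model (direct covariate-to-item effect
    fixed at 0) against the full model via a chi-square difference test."""
    delta_chisq = chisq_constrained - chisq_full  # constrained model fits worse
    delta_df = df_constrained - df_full           # one df per freed direct effect
    p_value = chi2.sf(delta_chisq, delta_df)      # survival function = 1 - CDF
    return delta_chisq, delta_df, p_value, p_value < alpha

# Hypothetical fit values read from SEM software output for one item's test
d_chi, d_df, p, flagged = dif_lr_test(112.4, 35, 104.9, 34)
print(f"dChi2 = {d_chi:.1f}, ddf = {d_df}, p = {p:.4f}, DIF flagged: {flagged}")

The study's two-step design corresponds to running this test per item first with gender as the only covariate, then refitting with socioeconomic status and reading ability added and checking whether previously flagged items remain significant.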

References

  • Ackerman, T. A. (1992). A didactic explanation of item bias, item impact, and item validity from a multidimensional perspective. Journal of Educational Measurement, 29, 67–91. doi:10.1111/j.1745-3984.1992.tb00368.x
  • Allalouf, A., Hambleton, R. K., & Sireci, S. G. (1999). Identifying the causes of DIF in translated verbal items. Journal of Educational Measurement, 36(3), 185-198.
  • Asil, M., & Gelbal, S. (2012). PISA öğrenci anketinin kültürler arası eşdeğerliği. Eğitim ve Bilim, 37(166), 236-249.
  • Atalay Kabasakal, K., & Kelecioğlu, H. (2012). PISA 2006 öğrenci anketinde yer alan maddelerin değişen madde fonksiyonu açısından incelenmesi. Ankara Üniversitesi Eğitim Bilimleri Fakültesi Dergisi, 45(2), 77-96.
  • Barr, A. B. (2015). Family socioeconomic status, family health, and changes in students' math achievement across high school: A mediational model. Social Science & Medicine, 140, 27-34.
  • Budgell, G. R., Raju, N. S., & Quartetti, D. A. (1995). Analysis of differential item functioning in translated assessment instruments. Applied Psychological Measurement, 19(4), 309-321.
  • Camilli, G. (1993). The case against item bias detection techniques based on internal criteria: Do item bias procedures obscure test fairness issues? In P. W. Holland & H. Wainer (Eds.), Differential item functioning (pp. 397-413). Hillsdale, NJ: Lawrence Erlbaum.
  • Camilli, G., & Shepard, L. A. (1994). Methods for identifying biased test items. London: Sage Publications.
  • Candell, G. L., & Drasgow, F. (1988). An iterative procedure for linking metrics and assessing item bias in item response theory. Applied Psychological Measurement, 12, 253-260.
  • Chun, S. (2014). Using MIMIC methods to detect and identify sources of DIF among multiple groups. Unpublished master's thesis, University of South Florida, USA.
  • Clauser, B., Mazor, K., & Hambleton, R. K. (1993). The effects of purification of the matching criterion on the identification of DIF using the Mantel-Haenszel procedure. Applied Measurement in Education, 6, 269-279.
  • Coleman, J. S., Campbell, E. Q., Hobson, C. J., McPartland, J., Mood, A. M., Weinfeld, F. D., & York, R. (1966). Equality of educational opportunity. Washington, DC: U.S. Government Printing Office.
  • Cromley, J. G. (2009). Reading achievement and science proficiency: International comparisons from the Programme on International Student Assessment. Reading Psychology, 30(2), 89-118.
  • Demps, D. L., & Onwuegbuzie, A. J. (2001). The relationship between eighth-grade reading scores and achievement on the Georgia High School Graduation Test. Research in the Schools, 8(2), 1-9.
  • Dorans, N. J., & Holland, P. W. (1993). DIF detection and description: Mantel-Haenszel and standardization. In P. W. Holland & H. Wainer (Eds.), Differential item functioning (pp. 35-66). Hillsdale, NJ: Lawrence Erlbaum Associates.
  • Dorans, N. J., & Kulick, E. (1986). Demonstrating the utility of the standardization approach to assessing unexpected differential item performance on the Scholastic Aptitude Test. Journal of Educational Measurement, 23, 355-368.
  • Ercikan, K., & Koh, K. (2005). Examining the construct comparability of the English and French versions of TIMSS. International Journal of Testing, 5(1), 23-35.
  • Finch, H. (2005). The MIMIC model as a method for detecting DIF: Comparison with Mantel-Haenszel, SIBTEST, and the IRT likelihood ratio. Applied Psychological Measurement, 29, 278-295.
  • Fleishman, J. A., Spector, W. D., & Altman, B. M. (2002). Impact of differential item functioning on age and gender differences in functional disability. Journal of Gerontology: Social Sciences, 57(5), 275-284.
  • French, B. F., & Maller, S. J. (2007). Iterative purification and effect size use with logistic regression for differential item functioning detection. Educational and Psychological Measurement, 67, 373-393.
  • Gallo, J. J., Anthony, J. C., & Muthén, B. O. (1994). Age differences in the symptoms of depression: A latent trait analysis. Journal of Gerontology: Psychological Sciences, 49, 251-264.
  • Glöckner-Rist, A., & Hoijtink, H. (2003). The best of both worlds: Factor analysis of dichotomous data using item response theory and structural equation modeling. Structural Equation Modeling, 10, 544-565.
  • Grisay, A., & Monseur, C. (2007). Measuring the equivalence of item difficulty in the various versions of an international test. Studies in Educational Evaluation, 33(1), 69-86.
  • Hambleton, R. K., Swaminathan, H., & Rogers, H. J. (1991). Fundamentals of item response theory. Newbury Park, CA: Sage.
  • Hecht, S. A., Burgess, S. R., Torgesen, J. K., Wagner, R. K., & Rashotte, C. A. (2000). Explaining social class differences in growth of reading skills from beginning kindergarten through fourth-grade: The role of phonological awareness, rate of access, and print knowledge. Reading and Writing, 12(1), 99-128.
  • Holland, P. W., & Thayer, D. T. (1988). Differential item performance and the Mantel-Haenszel procedure. In H. Wainer & H. I. Braun (Eds.), Test validity (pp. 129-145). Hillsdale, NJ: Lawrence Erlbaum.
  • Huang, X., Wilson, M., & Wang, L. (2016). Exploring plausible causes of differential item functioning in the PISA science assessment: Language, curriculum or culture? Educational Psychology, 36(2), 378-390. http://dx.doi.org/10.1080/01443410.2014.946890
  • Husin, M. (2014). Assessing mathematical competence in second language: Exploring DIF evidences from PISA Malaysian data. Unpublished master's thesis, University of Wisconsin, Milwaukee.
  • Hyde, J. S., & Linn, M. C. (1988). Gender differences in verbal ability: A meta-analysis. Psychological Bulletin, 104(1), 53-69. http://dx.doi.org/10.1037/0033-2909.104.1.53
  • Jöreskog, K. G., & Goldberger, A. S. (1975). Estimation of a model with multiple indicators and multiple causes of a single latent variable. Journal of the American Statistical Association, 70, 631-639.
  • Lan, M. C. (2014). Exploring gender differential item functioning (DIF) in eighth-grade mathematics items for the United States and Taiwan. Unpublished doctoral dissertation, University of Washington.
  • Le, L. T. (2009). Investigating gender differential item functioning across countries and test languages for PISA science items. International Journal of Testing, 9(2), 122-133. http://dx.doi.org/10.1080/15305050902880769
  • Levine, D. W., Bowen, D. J., Kaplan, R. M., Kripke, D. F., Naughton, M. J., & Shumaker, S. A. (2003). Factor structure and measurement invariance of the Women's Health Initiative Insomnia Rating Scale. Psychological Assessment, 15, 123-136.
  • Logan, S., & Johnston, R. (2009). Gender differences in reading ability and attitudes: Examining where these differences lie. Journal of Research in Reading, 32(2), 199-214. doi:10.1111/j.1467-9817.2008.01389.x
  • Lord, F. M. (1980). Applications of item response theory to practical testing problems. Hillsdale, NJ: Lawrence Erlbaum.
  • Lyons-Thomas, J., Sandilands, D., & Ercikan, K. (2014). Gender differential item functioning in mathematics in four international jurisdictions. Education and Science, 39(172), 20-32.
  • MacIntosh, R., & Hashim, S. (2003). Variance estimation for converting MIMIC model parameters to IRT parameters in DIF analysis. Applied Psychological Measurement, 27, 372-379.
  • Mellenbergh, G. J. (1989). Item bias and item response theory. International Journal of Educational Research, 13(2), 127-143.
  • Meredith, W., & Millsap, R. (1992). On the misuse of manifest variables in the detection of measurement bias. Psychometrika, 57(2), 289-311.
  • Millsap, R., & Meredith, W. (1992). Inferential conditions in the statistical detection of measurement bias. Applied Psychological Measurement, 16(4), 389-402.
  • Muthén, B. O. (1985). A method for studying the homogeneity of test items with respect to other relevant variables. Journal of Educational Statistics, 10, 121-132.
  • Muthén, B. O., Kao, C. F., & Burstein, L. (1991). Instructionally sensitive psychometrics: Application of a new IRT-based detection technique to mathematics achievement test items. Journal of Educational Measurement, 28(1), 1-22.
  • Muthén, L. K., & Muthén, B. O. (2010). Mplus: Statistical analysis with latent variables. User's guide (6th ed.). Los Angeles, CA: Muthén & Muthén.
  • NCES (2003). NAEP validity studies: An agenda for NAEP validity studies (Report No. 2003-07). Retrieved from https://nces.ed.gov/pubs2003/200307.pdf
  • Nolen, S. B. (2003). Learning environment, motivation, and achievement in high school science. Journal of Research in Science Teaching, 40(4), 347-368.
  • OECD (2015). PISA 2015 technical report. Retrieved from http://www.oecd.org/pisa/data/2015-technical-report/
  • Oort, F. J. (1998). Simulation study of item bias detection with restricted factor analysis. Structural Equation Modeling, 5, 107-124.
  • O’Reilly, T., & McNamara, D. S. (2007). The impact of science knowledge, reading skill, and reading strategy knowledge on more traditional “high-stakes” measures of high school students’ science achievement. American Educational Research Journal, 44(1), 161-196.
  • Schmidt, W. H., Cogan, L. S., & McKnight, C. C. (2011). Equality of educational opportunity: Myth or reality in U.S. schooling? American Educator, 34(4), 12-19.
  • Shealy, R. T., & Stout, W. F. (1993). A model-based standardization approach that separates true bias/DIF from group ability differences and detects test bias/DTF as well as item bias/DIF. Psychometrika, 58, 159-194.
  • Shih, C. L., & Wang, W. C. (2009). Differential item functioning detection using multiple indicators, multiple causes method with a pure short anchor. Applied Psychological Measurement, 33(3), 184-199.
  • Sireci, S. G., & Swaminathan, H. (1996). Evaluating translation equivalence: So what's the big DIF? Paper presented at the AERA, Ellenville, NY.
  • Swaminathan, H., & Rogers, H. J. (1990). Detecting differential item functioning using logistic regression procedures. Journal of Educational Measurement, 27, 361-370.
  • Thissen, D., Steinberg, L., & Wainer, H. (1988). Use of item response theory in the study of group differences in trace lines. In H. Wainer & H. Braun (Eds.), Test validity (pp. 147-169). Hillsdale, NJ: Lawrence Erlbaum.
  • Wainer, H., Sireci, S., & Thissen, D. (1991). Differential testlet functioning: Definitions and detection. Journal of Educational Measurement, 28, 197-219.
  • Walker, C. M., Zhang, B., & Surber, J. (2008). Using a multidimensional differential item functioning framework to determine if reading ability affects student performance in mathematics. Applied Measurement in Education, 21(2), 162-181. doi:10.1080/08957340801926201
  • Wang, W. C., Shih, C. L., & Yang, C.C. (2009). The MIMIC method with scale purification procedure for detecting differential item functioning. Educational and Psychological Measurement, 69(5), 713-731.
  • Wang, W. C., & Su, Y. H. (2004a). Effects of average signed area between two item characteristic curves and test purification procedures on the DIF detection via the Mantel-Haenszel method. Applied Measurement in Education, 17, 113-144.
  • Wang, W.-C., & Su, Y.-H. (2004b). Factors influencing the Mantel and generalized Mantel-Haenszel methods for the assessment of differential item functioning in polytomous items. Applied Psychological Measurement, 28, 450-480.
  • Wang, W. C., & Yeh, Y. L. (2003). Effects of anchor item methods on differential item functioning detection with the likelihood ratio test. Applied Psychological Measurement, 27, 479-498.
  • Welch, C. J., & Miller, T. R. (1995). Assessing differential item functioning in direct writing assessments: Problems and an example. Journal of Educational Measurement, 32, 163-178.
  • White, K. R. (1982). The relation between socioeconomic status and academic achievement. Psychological Bulletin, 91(3), 461-481.
  • Wright, B. D., & Stone, M. H. (1979). Best test design. Chicago: MESA Press.
  • Woods, C. M. (2009). Evaluation of MIMIC-model methods for DIF testing with comparison to two-group analysis. Multivariate Behavioral Research, 44(1), 1-27.
  • Woods, C. M., & Grimm, K. J. (2011). Testing for nonuniform differential item functioning with multiple indicator multiple cause models. Applied Psychological Measurement, 35(5), 339-361. http://dx.doi.org/10.1177/0146621611405984
  • Wu, A. D., & Ercikan, K. (2006). Using multiple-variable matching to identify cultural sources of differential item functioning. International Journal of Testing, 6(3), 287-300.
  • Zumbo, B. D. (1999). A handbook on the theory and methods of differential item functioning (DIF): Logistic regression modeling as a unitary framework for binary and Likert-type (ordinal) item scores. Ottawa, ON: Directorate of Human Resources Research and Evaluation, Department of National Defense.

Details

Journal Section Articles
Authors

Nermin Kıbrıslıoğlu Uysal

Kübra Atalay Kabasakal

Publication Date December 28, 2017
Acceptance Date October 18, 2017
Published in Issue Year 2017 Volume: 8 Issue: 4

Cite

APA Kıbrıslıoğlu Uysal, N., & Atalay Kabasakal, K. (2017). The Effect of Background Variables on Gender Related Differential Item Functioning. Journal of Measurement and Evaluation in Education and Psychology, 8(4), 373-390. https://doi.org/10.21031/epod.333451