Research Article

Latent Class Approach to Detect Differential Item Functioning: PISA 2015 Science Sample

Year 2020, Volume: 20 Issue: 88, 179 - 198, 20.07.2020

Abstract

Problem Statement: Individuals at the same ability level who belong to different groups should have an equal probability of answering a test item correctly. If an item favors one of the groups, the item is considered biased. Biased items contain systematic error and therefore lower the validity of scores; they also threaten the accurate comparison of scores across groups. International assessments such as PISA and TIMSS are administered to individuals who may belong to different groups in terms of culture, language, socioeconomic status, or gender. Although the test is not intended to measure these demographic characteristics, they can affect an individual's performance. Tests should therefore be examined for item bias before they are administered. The first sign of bias is that an item functions differently in two groups at the same ability level. Differential item functioning (DIF) analysis is a statistical technique for identifying potentially biased items. It is based on comparing item parameters between groups after one manifest (observed) group, defined by a variable such as gender or country, is designated as the reference group and the other as the focal group. Methods based on manifest groups have some limitations, however. All individuals within a manifest group (e.g., girls) are treated as advantaged or disadvantaged on the same item, whereas an item may in fact be advantageous for some members of a group and disadvantageous for others. This assumption rests on the idea that manifest groups are homogeneous, and the manifest grouping variable is then reported as the source of DIF. Researchers who criticize this assumption as rarely holding have argued that detecting DIF with respect to latent classes is more effective in locating the source of DIF. Previous studies indicate that the latent class approach can offer advantages in DIF analysis and allows the source of DIF to be examined independently of any predefined set of variables.
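For a dichotomous item, the absence of DIF described above can be stated as the requirement that the probability of a correct response, conditional on ability, be identical in the reference (R) and focal (F) groups. This is the standard definition from the DIF literature rather than a formula given in the abstract itself:

$$P(X_{j}=1 \mid \theta, G=R) = P(X_{j}=1 \mid \theta, G=F) \quad \text{for all } \theta,$$

where $X_{j}$ is the response to item $j$, $\theta$ is the ability level, and $G$ denotes group membership. An item shows DIF whenever this equality fails at some ability level.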

Purpose of the Study: The purpose of this study is to compare DIF results obtained with the latent class approach and with manifest-group methods based on item response theory.

Research Method: The Singapore and Turkey samples from the PISA 2015 administration were used, because groups that bring together individuals from different cultures are more likely to reflect latent classes. Only the PISA cognitive science items scored dichotomously (1-0) were considered. The study included 498 students from the Turkey sample and 614 students from Singapore who responded to the common items. The Mixture Item Response Theory (MixIRT) model was used to identify the latent classes; this analysis was carried out with the WINMIRA (2001) program. The Mantel-Haenszel technique was used to compare DIF across the latent classes. Lord's chi-square (χ²) method and the Item Response Theory Likelihood Ratio (IRT-LR) method were used to detect DIF based on the manifest groups; these analyses were performed with the 'difR' package in R.
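A minimal sketch of how the manifest-group and latent-class comparisons mentioned above could be run with the 'difR' package is given below. The object names, the focal-group labels, and the choice of IRT model are illustrative assumptions; the abstract does not report these settings.

```r
# Sketch under assumptions: 'resp' is a 0/1 item-response matrix for the pooled
# Turkey-Singapore sample, 'country' is a vector of country labels, and
# 'latent_class' holds the two-class membership exported from WINMIRA.
library(difR)

# Manifest-group DIF between the countries with Lord's chi-square
# (the "2PL" model is an assumption, not stated in the abstract)
lord_res <- difLord(Data = resp, group = country, focal.name = "Turkey", model = "2PL")

# DIF between the latent classes with the Mantel-Haenszel procedure
mh_res <- difMH(Data = resp, group = latent_class, focal.name = "Class2")

print(lord_res)
print(mh_res)
```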

References

  • Angoff, W. H. (1993). Perspective on differential item functioning methodology. In P. W. Holland & H. Wainer (Eds.), Differential item functioning (pp. 3–24). Hillsdale, NJ: Erlbaum.
  • Arikan, A. C. (2015). Comparison of Likelihood Ratio Test (LRT), Poly-SIBTEST and Logistic Regression in Differential Item Functioning (DIF) Detection Procedures. e-International Journal of Educational Research, 6(1), 1-16.
  • Asil, M. & Gelbal, S. (2012). Cross-cultural Equivalence of the PISA Student Questionnaire. Education and Science, 37, 236-249.
  • Bilir, M. K. (2009). Mixture Item Response Theory-Mimic Model: Simultaneous Estimation of Differential Item Functioning For Manifest Groups and Latent Classes. Doctoral Dissertation. Florida State University
  • Browne, M. W., & Cudeck, R. (1993). Alternative ways of assessing model fit. In K. A. Bollen & J. S. Long (Eds.). Testing structural equation models (pp. 136-162). Newsbury Park, CA: Sage.
  • Camilli, G., & Shepard, L. A. (1994). Methods for identifying biased test items. Thousand Oaks, CA: Sage.
  • Cho, S. J., & Cohen, A. S. (2010). A multilevel mixture IRT model with an application to DIF. Journal of Educational and Behavioral Statistics, 35(3), 336-370. doi: 10.3102/1076998609353111.
  • Choi, Y., Alexeev, N., & Cohen, A. S. (2015). Differential item functioning analysis using a mixture 3-parameter logistic model with a covariate on the TIMSS 2007 mathematics test. International Journal of Testing, 15(3), 239-253. doi: 10.1080/15305058.2015.1007241.
  • Cho, S. J. (2007). A multilevel mixture IRT model for DIF analysis. Unpublished doctoral dissertation, University of Georgia: Athens.
  • Cho, S. J., Suh, Y., & Lee, W. Y. (2016). An NCME instructional module on latent DIF analysis using mixture item response models. Educational Measurement: Issues and Practice, 35(1), 48-61. https://doi.org/10.1111/emip.12093
  • Clauser, B. E., & Mazor, K. M. (1998). Using statistical procedures to identify differentially functioning test items. Educational Measurement: Issues and Practice, 17(1), 31-44. doi: 10.1111/j.1745-3992.1998.tb00619.x
  • Cohen, A. S., & Bolt, D. M. (2005). A mixture model analysis of differential item functioning. Journal of Educational Measurement, 42(2), 133–148. doi: 10.1111/j.1745-3984.2005.00007.
  • De Ayala, R. J., Kim, S. H., Stapleton, L. M., & Dayton, C. M. (2002). Differential item functioning: A mixture distribution conceptualization. International Journal of Testing, 2(3-4), 243-276. doi: 10.1080/15305058.2002.9669495.
  • De Mars, C. E., & Lau, A. (2011). Differential item functioning detection with latent classes: How accurately can we detect who is responding differentially? Educational and Psychological Measurement, 71(4), 597–616. doi: 10.1177/0013164411404221.
  • Dorans, N. J., & Holland, P. W. (1993). DIF detection and description: Mantel-Haenszel and standardization. In P. W. Holland & H. Wainer (Eds.), Differential item functioning (pp. 35–66). Hillsdale, NJ: Erlbaum.
  • Finch, W. H. & Finch, M. E. H. (2013). Investigation of specific learning disability and testing accommodations based differential item functioning using a multilevel multidimensional mixture item response theory model. Educational and Psychological Measurement, 73(6) 973–993. doi: 10.1177/0013164413494776.
  • Gierl, M. J., Gotzmann, A., & Boughton, K. A. (2004). Performance of SIBTEST when the percentage of DIF items is large. Applied Measurement in Education, 17(3), 241-264. doi: https://doi.org/10.1207/s15324818ame1703_2
  • Hambleton, R. K., & Swaminathan, H. (1985). Item response theory: Principles and application. Boston, MA: Kluwer Academic Publishers Group.
  • Holland, P. W., & Thayer, D. T. (1988). Differential item performance and the Mantel-Haenszel procedure. In H. Wainer & H. I. Braun (Eds.), Test validity (pp. 129-145). Hillsdale, NJ: Erlbaum.
  • Hu, L. T., & Bentler, P. M. (1995). Evaluating model fit. In R. H. Hoyle (Ed.), Structural equation modeling: Concepts, issues, and applications (pp. 76-99). Thousand Oaks, CA: Sage.
  • Jöreskog, K. G., & Sörbom, D. (1993). LISREL 8: Structural equation modeling with the SIMPLIS command language. Lincolnwood, IL: Scientific Software International.
  • Kabasakal, K. A., Gök, B., Kelecioğlu, H., & Arsan, N. (2012). Değişen madde fonksiyonunun belirlemesinde kullanılan farklı yöntemlerin karşılaştırılması: bir simülasyon çalışması. Hacettepe University Journal of Education, 43(43), 270-281. Retrieved from https://dergipark.org.tr/tr/pub/hunefd/issue/7795/102030
  • Karadavut, T. (2017). DIF analysis with manifest and latent groups: Analysis of PISA 2012 mathematics data from Turkey. The Eurasia Proceedings of Educational & Social Sciences, 8, 103-106. Retrieved from: http://static.dergipark.org.tr/article download/cf1d/c744/dd82/5a341ef0af7f6.pdf?
  • Kelderman, H., & Macready, G. B. (1990). The use of loglinear models for assessing differential item functioning across manifest and latent examinee groups. Journal of Educational Measurement, 27(4), 307–327. https://doi.org/10.1111/j.1745-3984.1990.tb00751.x
  • Levent, F. & Yazici, E. (2014). Singapur eğitim sisteminin başarısına etki eden faktörlerin incelenmesi. Journal of Educational Sciences, 39, 121-143. doi: 10.15285/EBD.2014397401
  • Magis, D. (2018). Collection of methods to detect dichotomous differential item functioning (DIF). Package ‘difR’.
  • Maij-de Meij, A. M., Kelderman, H. & van der Flier, H. (2010). Improvement in detection of differential item functioning using a mixture item response theory model. Multivariate Behavioral Research, 45(6), 975-999. doi:10.1080/00273171.2010.533047.
  • Mellenbergh, G. J. (1982). Contingency table models for assessing item bias. Journal of Educational Statistics, 7(2), 105-118.
  • Millsap, R.E. & Everson, H.T. (1993). Methodology Review: Statistical Approaches for Assessing Measurement Bias. Applied Psychological Measurement, 17(4), 297–334.
  • Muthén, L. K., & Muthén, B. O. (2012). Mplus user’s guide (Eighth Edition). Los Angeles, CA: Muthén & Muthén.
  • Oliveri, M. E., Ercikan, K., & Zumbo, B. (2013). Analysis of sources of latent class differential item functioning in international assessments. International Journal of Testing, 13(3), 272-293.
  • Penfield, R. D. (2007). An approach for categorizing DIF in polytomous items. Applied Measurement in Education, 20(3), 335-355.
  • Rost, J. (1990). Rasch models in latent classes: An integration of two approaches to item analysis. Applied Psychological Measurement, 14, 271–282.
  • Roussos, L. A., Schnipke, D. L., & Pashley, P. J. (1999). A generalized formula for the Mantel-Haenszel differential item functioning parameter. Journal of Educational and Behavioral Statistics, 24(3), 293-322.
  • Samuelsen, K. M. (2005). Examining differential item functioning from a latent class perspective. Unpublished doctoral dissertation, University of Maryland, College Park.
  • Sawatzky, R., Ratner, P. A., Kopec, J. A., & Zumbo, B. D. (2012). Latent variable mixture models: A promising approach for the validation of patient reported outcomes. Quality of Life Research, 21(4), 637-650.
  • Sunbul, S. O. & Sunbul, O. (2016). Değişen madde fonksiyonunun belirlenmesinde kullanılan yöntemlerde I. Tip hata ve güç çalışması. Elementary Education Online, 15(3), 882-897.
  • Uyar, S., Kelecioglu, H., & Dogan, N. (2017). Comparing differential item functioning based on manifest groups and latent classes. Educational Sciences: Theory & Practice, 17(6), 1977–2000. doi: 10.12738/estp.2017.6.0526.
  • Van Nijlen, D., & Janssen, R. (2008). Mixture IRT-models as a means of DIF-detection: Modelling spelling in different grades of primary school. Paper presented at the annual meeting of the National Council on Measurement in Education, New York, NY.
  • von Davier, M. (2001). WINMIRA 2001: Software for estimating Rasch models, mixed and hybrid Rasch models and latent class analysis [Computer software]. Retrieved from http://www.von-davier.com/
  • Wu, A. D., & Ercikan, K. (2006). Using multiple-variable matching to identify cultural sources of differential item functioning. International Journal of Testing, 6(3), 287-300.
  • Yalcin, S. (2018). Determining differential item functioning with the mixture item response theory. Eurasian Journal of Educational Research, 74, 187-206.
  • Yuksel, S. (2012). Analyzing differential item functioning by mixed Rasch models which stated in scales. Unpublished doctoral dissertation, Ankara University Graduate School of Health Sciences, Ankara.
  • Zumbo, B. D. (1999). A handbook on the theory and methods of differential item functioning (DIF): Logistic regression modeling as a unitary framework for binary and Likert-type (ordinal) item scores. Ottawa, ON: Directorate of Human Resources Research and Evaluation, Department of National Defense.

Latent Class Approach to Detect Differential Item Functioning: PISA 2015 Science Sample

Year 2020, Volume: 20 Issue: 88, 179 - 198, 20.07.2020

Abstract

Purpose: This study aimed to compare the performance of the latent class differential item functioning (DIF) approach with that of IRT-based DIF methods using manifest grouping, and thereby to draw attention to latent class DIF studies in Turkey. Specifically, DIF was examined in the PISA 2015 science data set.

Research Methods: Only dichotomously scored items were considered in this study. The Turkey and Singapore samples were used to examine DIF; the Singapore data set contained 6115 students and the Turkey sample 5895 students. The Item Response Theory Likelihood Ratio (IRT-LR) and Lord's chi-square techniques were used to detect DIF between the countries based on manifest grouping. In addition, latent classes were defined with a Mixture Item Response Theory model, and DIF items among the latent classes were detected with the Mantel-Haenszel (MH) method. The numbers of DIF items identified across the latent classes and between the two countries were then compared.
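The latent classes referred to above come from a mixture Rasch-type model in the sense of Rost (1990), in which each latent class has its own item parameters. In its standard dichotomous form, cited here for clarity rather than reproduced from the article, the model is

$$P(X_{vi}=1 \mid \theta_{v}, g) = \frac{\exp(\theta_{vg} - \beta_{ig})}{1 + \exp(\theta_{vg} - \beta_{ig})},$$

where $\theta_{vg}$ is the ability of person $v$ within latent class $g$ and $\beta_{ig}$ is the difficulty of item $i$ in that class; an item exhibits latent DIF when $\beta_{ig}$ differs across classes.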

Findings: Eight items showed DIF among the latent classes. With Lord's chi-square method, four items showed moderate- or high-level DIF between Turkey and Singapore, while the IRT-LR method indicated that only two items showed DIF between the countries.
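The moderate/high labels reported for manifest-group DIF are ordinarily based on the ETS delta-scale classification; the abstract does not state the cut-offs, so the following is given only as the conventional reference point: DIF is negligible (A) when $|\Delta_{MH}| < 1.0$, moderate (B) when $1.0 \le |\Delta_{MH}| < 1.5$, and large (C) when $|\Delta_{MH}| \ge 1.5$, where $\Delta_{MH} = -2.35\ln(\alpha_{MH})$ and $\alpha_{MH}$ is the Mantel-Haenszel common odds ratio.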

Implications for Research and Practice: Based on the results, the latent class approach is recommended for investigating DIF items in cross-country studies.


Details

Primary Language English
Journal Section Articles
Authors

Seyma Uyar (ORCID: 0000-0002-8315-2637)

Publication Date July 20, 2020
Published in Issue Year 2020 Volume: 20 Issue: 88

Cite

APA Uyar, S. (2020). Latent Class Approach to Detect Differential Item Functioning: PISA 2015 Science Sample. Eurasian Journal of Educational Research, 20(88), 179-198.
AMA Uyar S. Latent Class Approach to Detect Differential Item Functioning: PISA 2015 Science Sample. Eurasian Journal of Educational Research. July 2020;20(88):179-198.
Chicago Uyar, Seyma. “Latent Class Approach to Detect Differential Item Functioning: PISA 2015 Science Sample”. Eurasian Journal of Educational Research 20, no. 88 (July 2020): 179-98.
EndNote Uyar S (July 1, 2020) Latent Class Approach to Detect Differential Item Functioning: PISA 2015 Science Sample. Eurasian Journal of Educational Research 20 88 179–198.
IEEE S. Uyar, “Latent Class Approach to Detect Differential Item Functioning: PISA 2015 Science Sample”, Eurasian Journal of Educational Research, vol. 20, no. 88, pp. 179–198, 2020.
ISNAD Uyar, Seyma. “Latent Class Approach to Detect Differential Item Functioning: PISA 2015 Science Sample”. Eurasian Journal of Educational Research 20/88 (July 2020), 179-198.
JAMA Uyar S. Latent Class Approach to Detect Differential Item Functioning: PISA 2015 Science Sample. Eurasian Journal of Educational Research. 2020;20:179–198.
MLA Uyar, Seyma. “Latent Class Approach to Detect Differential Item Functioning: PISA 2015 Science Sample”. Eurasian Journal of Educational Research, vol. 20, no. 88, 2020, pp. 179-98.
Vancouver Uyar S. Latent Class Approach to Detect Differential Item Functioning: PISA 2015 Science Sample. Eurasian Journal of Educational Research. 2020;20(88):179-98.