This study aimed to examine the inter-rater reliability of the same performance tasks scored by seven raters using a poster scoring rubric, a science/chemistry laboratory applications performance scoring rubric, and an open-ended achievement test scoring rubric, within the framework of the many-facet Rasch model (MFRM). The participants were 27 sixth-grade gifted students attending BİLSEM, selected through criterion sampling, a purposive sampling method, and seven teachers who served as raters. The Density Open-Ended Achievement Test, the Poster Scoring Rubric, and the Science/Chemistry Laboratory Applications Process Performance Analytical Rubric were used to collect the data. FACETS was used for the MFRM analyses, and JASP 0.17.2.1 was used for model–data fit analyses and assumption tests. The results showed that, for all three scoring tools, individuals could be distinguished at the group level in terms of ability and the items differed in difficulty; raters differed in severity/leniency for all tools except the open-ended achievement test. In the scoring of the poster and the experiment performance evaluation, raters six and seven showed severity/leniency errors, whereas no halo effect was found in any of the scorings, including the open-ended test and the experiment performance. No inconsistent scoring errors were found in the scoring of the open-ended achievement test, the poster evaluation, or the experiment performances based on the rubric for the density of irregularly shaped solids.
Keywords: Many-facet Rasch measurement model, gifted students, rater effects
• Our study is original and current. • We have cited all the works we drew on completely and accurately, and all quotations follow APA 7. • There is no conflict of interest among the authors. • We can provide the raw data upon request. • We have submitted our study only to this journal. We declare that all of this information is accurate.
| Primary Language | English |
|---|---|
| Subjects | Testing, Assessment and Psychometrics (Other) |
| Journal Section | Research Article |
| Authors | |
| Project Number | - |
| Submission Date | November 14, 2024 |
| Acceptance Date | September 22, 2025 |
| Publication Date | December 31, 2025 |
| Published in Issue | Year 2025 Volume: 16 Issue: 4 |