Research Article

Examining Students' Item Response Times in eTIMSS According to their Proficiency Levels, Self-Confidence and Item Characteristics

Volume: 13 Number: 1 March 29, 2022

Abstract

The aim of this study was to examine whether the time Turkish eighth-grade students participating in the Trends in International Mathematics and Science Study (TIMSS) spent answering science and mathematics items differed significantly according to their proficiency levels, self-confidence, and item characteristics. The study was designed as correlational research exploring the relationships among these variables. The participants were 577 students who took part in TIMSS 2019 at the eighth-grade level in Turkey and answered the 24 items (11 mathematics and 13 science) common to Booklets 1 and 2. The data were analyzed using the Kruskal-Wallis H test, the Mann-Whitney U test, and latent class analysis. The results showed that item type and cognitive domain were significantly related to students' item response times. Students spent more time on open-ended items than on multiple-choice items, and significantly more time on items in the applying domain than on items in the knowing domain; however, the time spent on applying items and reasoning items did not differ significantly. Students with high self-confidence in science answered a high proportion of items correctly and responded in a short amount of time. Students who were somewhat self-confident in mathematics were more successful on difficult mathematics items and spent less time answering them.
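The group comparisons described above can be sketched with the two nonparametric tests the abstract names. This is a minimal illustration only; the response-time data, group sizes, and distributional assumptions below are invented for the example and are not the study's data.

```python
# Illustrative sketch (hypothetical data): comparing item response times
# across groups with the Kruskal-Wallis H and Mann-Whitney U tests.
import numpy as np
from scipy.stats import kruskal, mannwhitneyu

rng = np.random.default_rng(42)

# Hypothetical response times in seconds for one item, by proficiency group
# (log-normal shapes are a common assumption for response times).
low = rng.lognormal(mean=4.0, sigma=0.4, size=100)
intermediate = rng.lognormal(mean=3.8, sigma=0.4, size=100)
high = rng.lognormal(mean=3.5, sigma=0.4, size=100)

# Kruskal-Wallis H: do the three proficiency groups differ in response time?
h_stat, h_p = kruskal(low, intermediate, high)

# Mann-Whitney U: a two-group comparison, e.g. multiple-choice vs. open-ended.
mc_times = rng.lognormal(mean=3.6, sigma=0.4, size=120)
oe_times = rng.lognormal(mean=4.1, sigma=0.4, size=120)
u_stat, u_p = mannwhitneyu(mc_times, oe_times, alternative="two-sided")

print(f"Kruskal-Wallis H = {h_stat:.2f}, p = {h_p:.4f}")
print(f"Mann-Whitney U = {u_stat:.2f}, p = {u_p:.4f}")
```

Both tests compare rank distributions rather than means, which suits skewed response-time data where normality cannot be assumed.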



Details

Primary Language

English


Journal Section

Research Article

Publication Date

March 29, 2022

Submission Date

September 23, 2021

Acceptance Date

December 27, 2021

Published in Issue

Year 2022 Volume: 13 Number: 1

APA
Yalçın, S. (2022). Examining Students’ Item Response Times in eTIMSS According to their Proficiency Levels, Self-Confidence and Item Characteristics. Journal of Measurement and Evaluation in Education and Psychology, 13(1), 23-39. https://doi.org/10.21031/epod.999545
