Research Article

Collaborative problem-solving design in large-scale assessments: Shedding lights in sequential conversation-based measurement

Volume 10, Special Issue, December 27, 2023
Abstract

Collaborative problem solving (CPS) is inherently an interactive, conjoint, dual-strand process that considers how a student reasons about a problem as well as how he or she interacts with others to regulate social processes and exchange information (OECD, 2013). Measuring CPS skills presents a challenge for obtaining a consistent, accurate, and reliable scale across individuals and user populations. The Programme for International Student Assessment (PISA) 2015 cycle first introduced an assessment of CPS into international large-scale assessments, in which computer-based conversational agents were adapted to represent team members with a range of skills and abilities. This study draws on measures of the CPS domain in PISA 2015 to address the challenges and solutions related to CPS item design and to shed light on sequential conversation-based measurement. Specifically, we present the process of CPS item design and the development of scoring rules through CPS conversation paths, and we discuss possible approaches to better estimating CPS beyond item response models.
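As a minimal illustration of the kind of sequential conversation-path scoring the abstract refers to, the sketch below assigns partial credit to the response options a student selects across chat turns with a conversational agent. All turn labels, options, and credit values here are invented for illustration; they are not PISA's actual scoring rubric.

```python
# Hypothetical sketch of rule-based scoring over a CPS conversation path.
# Each node is a chat turn where the student chooses one response option;
# each option carries partial credit toward a targeted CPS skill.
# (Illustrative only: turn ids, options, and credit values are invented.)

# Scoring rules: turn id -> {chosen option -> credit}
SCORING_RULES = {
    "turn_1": {"A": 0, "B": 1, "C": 2},   # e.g., establishing shared understanding
    "turn_2": {"A": 2, "B": 0},           # e.g., negotiating a plan with the agent
    "turn_3": {"A": 1, "B": 2, "C": 0},   # e.g., monitoring team progress
}

def score_path(path):
    """Sum the credit earned along one student's conversation path.

    `path` is an ordered list of (turn_id, chosen_option) pairs, as might be
    reconstructed from computer-based assessment log files.
    """
    total = 0
    for turn_id, option in path:
        total += SCORING_RULES[turn_id].get(option, 0)
    return total

# One student's path through the agent dialogue:
student_path = [("turn_1", "C"), ("turn_2", "A"), ("turn_3", "B")]
print(score_path(student_path))  # -> 6 (the maximum credit in this toy rubric)
```

In practice, path-level scores like this would feed into a measurement model (e.g., an item response model) rather than being reported as raw sums, which is part of the estimation question the article raises.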


Details

Primary Language

English

Subjects

Measurement Theories and Applications in Education and Psychology

Journal Section

Research Article

Publication Date

December 27, 2023

Submission Date

December 20, 2023

Acceptance Date

December 24, 2023

Published in Issue

2023, Volume 10, Special Issue

APA
He, Q. (2023). Collaborative problem-solving design in large-scale assessments: Shedding lights in sequential conversation-based measurement. International Journal of Assessment Tools in Education, 10(Special Issue), 194-207. https://doi.org/10.21449/ijate.1407315
