Research Article

Collaborative problem-solving design in large-scale assessments: Shedding lights in sequential conversation-based measurement

Year 2023, Volume: 10, Issue: Special Issue, 194-207, 27.12.2023
https://doi.org/10.21449/ijate.1407315

Abstract

Collaborative problem solving (CPS) is inherently an interactive, conjoint, dual-strand process that considers how a student reasons about a problem as well as how s/he interacts with others to regulate social processes and exchange information (OECD, 2013). Measuring CPS skills presents a challenge for obtaining a consistent, accurate, and reliable scale across individuals and user populations. The 2015 cycle of the Programme for International Student Assessment (PISA) first introduced an assessment of CPS into international large-scale assessments, in which computer-based conversational agents were adapted to represent team members with a range of skills and abilities. This study draws on measures of the CPS domain in PISA 2015 to address the challenges and solutions related to CPS item design and to shed light on sequential conversation-based measurement. Specifically, we present the process of CPS item design and the development of scoring rules through CPS conversation paths, and we discuss possible approaches to estimating CPS beyond item response models.
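For context, the item response models that the abstract proposes to move beyond are exemplified by the Rasch model cited in the references (Rasch, 1960; Lord, 1980), under which the probability that student $i$ answers item $j$ correctly depends only on the gap between the student's ability $\theta_i$ and the item's difficulty $b_j$:

$$P(X_{ij} = 1 \mid \theta_i, b_j) = \frac{\exp(\theta_i - b_j)}{1 + \exp(\theta_i - b_j)}$$

To make "scoring rules through CPS conversation paths" concrete, the minimal sketch below treats a conversation path as the ordered sequence of chat options a student selects and looks that path up in a rule table assigning full, partial, or no credit. All option codes, paths, and credit values are hypothetical illustrations, not the actual PISA 2015 scoring tables.

```python
# Illustrative sketch only: option codes, paths, and credit values are
# hypothetical and are not taken from the PISA 2015 scoring tables.

FULL, PARTIAL, NONE = 2, 1, 0

# A conversation path is the ordered tuple of chat options a student
# selects while interacting with the conversational agents.
SCORING_RULES = {
    ("ask_teammate", "share_info", "propose_plan"): FULL,   # ideal path
    ("share_info", "propose_plan"): PARTIAL,                # skips perspective-taking
}

def score_path(path: tuple[str, ...]) -> int:
    """Score a conversation path: exact rule matches first, then fall back
    to the highest-credit rule that is a prefix of the observed path."""
    if path in SCORING_RULES:
        return SCORING_RULES[path]
    best = NONE
    for rule, credit in SCORING_RULES.items():
        if path[: len(rule)] == rule:
            best = max(best, credit)
    return best

print(score_path(("ask_teammate", "share_info", "propose_plan")))  # 2 (full)
print(score_path(("share_info", "propose_plan", "off_topic")))     # 1 (partial)
print(score_path(("off_topic",)))                                  # 0 (none)
```

Because each chat turn in such a design offers a closed set of predefined options, a deterministic table of this shape can machine-score conversation-based items; approaches beyond item response models would additionally exploit the full path as sequence data rather than only the final credit.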

References

  • Casner-Lotto, J., & Barrington, L. (2006). Are they really ready to work? Employers’ perspectives on the basic knowledge and applied skills of new entrants to the 21st century U.S. workforce. http://www.conference-board.org/pdf_free/BED-06-Workforce.pdf
  • Cooke, N.J., Duchon, A., Gorman, J.C., Keyton, J., & Miller, A. (2012). Preface to the special section on methods for the analysis of communication. Human Factors: The Journal of the Human Factors and Ergonomics Society, 54, 485–488.
  • de Boeck, P., & Scalise, K. (2019). Collaborative problem solving: Processing actions, time, and performance. Frontiers in Psychology, 10, 1280.
  • Dede, C. (2009). Immersive interfaces for engagement and learning. Science, 323(5910), 66-69.
  • Dede, C. (2012). Interweaving assessments into immersive authentic simulations: Design strategies for diagnostic and instructional insights. Paper presented at the Invitational Research Symposium on Technology Enhanced Assessments. http://www.k12center.org/rsc/pdf/session4-dede-paper-tea2012.pdf
  • Gao, Y., Cui, Y., Bulut, O., Zhai, X., & Chen, F. (2022). Examining adults’ web navigation patterns in multi-layered hypertext environments. Computers in Human Behavior, 129, 107142.
  • Greiff, S., Wüstenberg, S., Csapo, B., Demetriou, A., Hautamäki, J., Graesser, A. C., & Martin, R. (2014). Domain-general problem-solving skills and education in the 21st century. Educational Research Review, 13, 74-83.
  • Griffin, P., McGaw, B., & Care, E. (Eds.). (2012). Assessment and teaching of 21st century skills. Springer.
  • Han, A., Krieger, F., Borgonovi, F., & Greiff, S. (2023). Behavioral patterns in collaborative problem solving: A latent profile analysis based on response times and actions in PISA 2015. Large-scale Assessments in Education, 11(1), 35.
  • Han, Z., He, Q., & von Davier, M. (2019). Predictive feature generation and selection using process data from PISA interactive problem-solving items: An application of random forests. Frontiers in Psychology, 10, 1421.
  • He, Q., Borgonovi, F., & Paccagnella, M. (2021). Leveraging process data to assess adults' problem-solving skills: Identifying generalized behavioral patterns with sequence mining. Computers & Education, 166, 104170.
  • He, Q., Borgonovi, F., & Suárez-Álvarez, J. (2023a). Clustering sequential navigation patterns in multiple-source reading tasks with dynamic time warping method. Journal of Computer Assisted Learning, 39(3), 719-736.
  • He, Q., Shi, Q., & Tighe, E. (2023b). Predicting problem-solving proficiency with hierarchical supervised models on response process. Psychological Test and Assessment Modeling, 65(1), 145-178.
  • He, Q., & von Davier, M. (2016). Analyzing process data from problem-solving items with n-grams: Insights from a computer-based large-scale assessment. In Y. Rosen, S. Ferrara, & M. Mosharraf (Eds.), Handbook of research on technology tools for real-world skill development (pp. 749-776). Information Science Reference.
  • He, Q., von Davier, M., Greiff, S., Steinhauer, E.W., & Borysewicz, P.B. (2017). Collaborative problem-solving measures in the Programme for International Student Assessment (PISA). In A.A. von Davier, M. Zhu, & P.C. Kyllonen, (Eds.), Innovative assessment of collaboration (pp. 95-111). Springer.
  • Hirschberg, D.S. (1975). A linear space algorithm for computing maximal common subsequences. Communications of the ACM, 18, 341-343.
  • Hirschberg, D.S. (1977). Algorithms for the longest common subsequence problem. Journal of the ACM, 24(4), 664-675.
  • Lord, F.M. (1980). Applications of item response theory to practical testing problems. Erlbaum.
  • National Academies. (2012). Education for life and work: Developing transferable knowledge and skills in the 21st century. http://sites.nationalacademies.org/cs/groups/dbassesite/documents/webpage/dbasse_070895.pdf
  • National Center for Education Statistics (2015). The nation’s report card: 2015 mathematics and reading assessments. Publication No. NCES 2015136. Washington, DC: Author.
  • OpenAI. (2023). ChatGPT (May 24 version) [Large language model]. https://chat.openai.com/chat/
  • Organisation for Economic Co-operation and Development (2013). PISA 2015: Draft collaborative problem solving framework. Paris, France: Author.
  • Organisation for Economic Co-operation and Development (2015a). PISA 2015 released field trial cognitive items. Paris, France: Author.
  • Organisation for Economic Co-operation and Development (2015b). PISA 2015 field trial analysis report: Outcomes of the cognitive assessment (JT03371930). Paris, France: Author.
  • Organisation for Economic Co-operation and Development (2017). PISA 2015 results (Volume V): Collaborative problem solving. Paris, France: OECD Publishing.
  • Qiao, X., Jiao, H., & He, Q. (2023). Multiple-group joint modeling of item responses, response times, and action counts with the Conway-Maxwell-Poisson distribution. Journal of Educational Measurement, 60(2), 255-281.
  • Quellmalz, E.S., Timms, M.J., & Schneider, S.A. (2009). Assessment of student learning in science simulations and games. Paper prepared for the National Research Council Workshop on Gaming and Simulations, Washington, DC.
  • Rasch, G. (1960). Probabilistic models for some intelligence and attainment tests. Copenhagen, Denmark: Danish Institute for Educational Research.
  • Rosenbaum, P.R. (1988). Item bundles. Psychometrika, 53(3), 349-359.
  • Stecher, B.M., & Hamilton, L.S. (2014). Measuring hard-to-measure student competencies: A research and development plan, Research Report. RAND Corporation.
  • Ulitzsch, E., He, Q., Ulitzsch, V., Nichterlein, A., Molter, H., Niedermeier, R., & Pohl, S. (2021). Combining clickstream analyses and graph-modeled data clustering for identifying common response processes. Psychometrika, 86, 190-214.
  • Ulitzsch, E., Ulitzsch, V., He, Q., & Lüdtke, O. (2023). A machine learning-based procedure for leveraging clickstream data to investigate early predictability of failure on interactive tasks. Behavior Research Methods, 55, 1392–1412.
  • von Davier, M., Khorramdel, L., He, Q., Shin, H., & Chen, H. (2019). Developments in psychometric population models for data from innovative items. Journal of Educational and Behavioral Statistics, 44(6), 671-705.
  • Wildman, J.L., Thayer, A.L., Pavlas, D., Salas, E., Stewart, J.E., & Howse, W. (2012). Team knowledge research: Emerging trends and critical needs. Human Factors: The Journal of the Human Factors and Ergonomics Society, 54, 84-111.
  • Wilson, M., & Adams, R.J. (1995). Rasch models for item bundles. Psychometrika, 60(2), 181-198.
  • Xiao, Y., He, Q., Veldkamp, B.P., & Liu, H. (2021). Exploring latent states of problem-solving competence using hidden Markov modeling on process data. Journal of Computer Assisted Learning, 37(5), 1232-1247.


Details

Primary Language: English
Subjects: Measurement Theories and Applications in Education and Psychology
Section: Special Issue 2023
Authors

Qiwei He (ORCID: 0000-0001-8942-2047)

Publication Date: December 27, 2023
Submission Date: December 20, 2023
Acceptance Date: December 24, 2023
Published in Issue: Year 2023, Volume: 10, Issue: Special Issue

Cite

APA He, Q. (2023). Collaborative problem-solving design in large-scale assessments: Shedding lights in sequential conversation-based measurement. International Journal of Assessment Tools in Education, 10(Special Issue), 194-207. https://doi.org/10.21449/ijate.1407315
