Multi-Trait Multi-Method Matrices for the Validation of Creativity and Critical Thinking Assessments for Secondary School Students in England and Greece
Abstract
This paper aims to validate measurement tools which assess critical thinking and creativity as general constructs rather than subject-specific skills. Specifically, the research examined whether measurement tools of creativity and critical thinking demonstrate convergent and discriminant (or divergent) validity. For this purpose, the multitrait-multimethod matrix suggested by Campbell and Fiske (1959) was used. This matrix presents the correlations between the scores students obtain in different assessments in order to reveal whether the assessments measure the same or different constructs. The two methods used were written and oral exams, and the two traits measured were critical thinking and creativity. For the validation of the assessments, 30 secondary-school students in Greece and 21 in England completed the assessments. The samples in both countries provided similar results. The critical thinking tools demonstrated convergent validity when compared with each other and discriminant validity when compared with the creativity assessments. Furthermore, creativity assessments which measure the same aspect of creativity demonstrated convergent validity. In conclusion, this research provides indicators that critical thinking and creativity as general constructs can be measured in a valid way. However, since the sample was small, further investigation of the validation of the assessment tools with a larger sample is recommended.
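The logic of the multitrait-multimethod matrix described above can be sketched numerically. The following Python snippet uses entirely hypothetical (randomly generated) scores for two traits (critical thinking, creativity) crossed with two methods (written, oral exam); it is an illustration of how convergent entries (same trait, different method) are compared against discriminant entries (different traits), not a reproduction of the study's data or analysis.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 30  # hypothetical number of students

# Simulate a latent ability per trait, shared across methods.
ability_ct = rng.normal(size=n)  # latent critical-thinking level
ability_cr = rng.normal(size=n)  # latent creativity level

# Observed scores = latent trait + method-specific noise.
scores = {
    "CT-written": ability_ct + rng.normal(scale=0.5, size=n),
    "CT-oral":    ability_ct + rng.normal(scale=0.5, size=n),
    "CR-written": ability_cr + rng.normal(scale=0.5, size=n),
    "CR-oral":    ability_cr + rng.normal(scale=0.5, size=n),
}

labels = list(scores)
mtmm = np.corrcoef(np.stack([scores[k] for k in labels]))

# Monotrait-heteromethod entry (convergent validity evidence)
# versus a heterotrait entry (discriminant validity evidence).
convergent_ct = mtmm[labels.index("CT-written"), labels.index("CT-oral")]
discriminant = mtmm[labels.index("CT-written"), labels.index("CR-oral")]
print(f"convergent (CT written vs CT oral): {convergent_ct:.2f}")
print(f"discriminant (CT written vs CR oral): {discriminant:.2f}")
```

In a validation argument of the kind the abstract describes, the convergent correlations should be substantially larger than the discriminant ones.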
References
- Australian Curriculum (n.d.). Critical and Creative Thinking. Available at: https://www.australiancurriculum.edu.au/f-10-curriculum/general-capabilities/critical-and-creative-thinking/ (accessed: 6 August 2017)
- BERA (2011). Ethical Guidelines for Educational Research. British Educational Research Association. Available at: https://www.bera.ac.uk/researchers-resources/publications/ethical-guidelines-for-educational-research-2011 (accessed: 5 August 2017)
- Berliner, D.C. (2011). ‘The Context for Interpreting PISA Results in the USA: Negativism, Chauvinism, Misunderstanding, and the Potential to Distort the Educational Systems of Nations’. In Pereyra, M.A., Kotthoff, H. & Cowen, R. (eds.) PISA Under Examination: Changing Knowledge, Changing Tests, and Changing Schools. (pp. 77-96). Rotterdam: Sense Publishers
- Burton, S.J., Sudweeks, R.R., Merrill, P.G. & Wood, B. (1991). How to Prepare Better Multiple-Choice Test Items: Guidelines for University Faculty. Brigham Young University Testing Services and The Department of Instructional Science.
- Campbell, D.T. & Fiske, D.W. (1959). Convergent and Discriminant Validation by the Multitrait-Multimethod Matrix. Psychological Bulletin, 56(2), 81-105
- Coe, R. (2012). ‘Conducting Your Research: Inference and Interpretation’. In Arthur, J., Waring, M., Coe, R. & Hedges, L.V. (eds.) Education Research: Methods and Methodologies. (pp. 41-52). London: Sage
- Cox, R.C. & Vargas, J.S. (1966). A Comparison of Item Selection Techniques for Norm-Referenced and Criterion-Referenced Tests. University of Pittsburgh.
- Critical Thinking Society (2013). Defining Critical Thinking. Available at: http://www.criticalthinking.org/pages/defining-critical-thinking/766 (accessed: 28 January 2015)
Details
Primary Language
English
Subjects
Studies on Education
Journal Section
Research Article
Authors
Ourania Maria Ventista
Durham University
United Kingdom
Publication Date
January 1, 2018
Submission Date
August 18, 2017
Acceptance Date
August 14, 2017
Published in Issue
Year 2018 Volume: 5 Number: 1
Cited By
A multi‐country comparison of lower secondary students' critical thinking under the International Baccalaureate and national curricula
Review of Education
https://doi.org/10.1002/rev3.3442
Exploring the Relationship Between Critical Thinking and Creativity in University Students: Gender Differences and the Assessment of Skills
Education Sciences
https://doi.org/10.3390/educsci15040464
Teachers’ questions during non-standardised oral exams – construct, assessed domain, and possible implications for validity
Educational Assessment, Evaluation and Accountability
https://doi.org/10.1007/s11092-026-09483-2