Research Article
Year 2022, Volume: 9 Issue: 3, 362 - 382, 01.05.2022
https://doi.org/10.17275/per.22.71.9.3


Development and use of a rubric to assess undergraduates’ problem solutions in Physics


Abstract

The aim of this study is to develop and apply a rubric for evaluating second-year pre-service teachers' solutions to questions on electromagnetic induction. The study used a pretest-posttest quasi-experimental design with a control group: both groups received the same instruction on electromagnetic induction, and a four-question test was administered before and after the instruction. The 73 students in the experimental group were additionally informed about the properties and use of rubrics and were asked to create one. The effect of the rubric on student achievement was examined by applying descriptive statistics and t-tests to the scores obtained from both tests. The validity and reliability of rubric-based scoring were analysed by calculating linear regression and t-test statistics together with Pearson, intraclass and Cronbach-alpha correlation coefficients. The analyses show that the rubric was used consistently by the researcher and an independent coder, with a high and significant (p < .001) relationship between their scores on all questions; in the inter-rater reliability analysis, the lowest ICC for any single question was .826. In light of these findings, it was concluded that the rubric helped to produce consistent, scorer-independent ratings, to characterise the features of the problem solutions, and to raise students' level of achievement. Analytic rubrics of this kind can be developed for other physics topics and the results compared with those of this study.
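The inter-rater statistics named in the abstract (Pearson correlation, intraclass correlation, Cronbach's alpha) can be illustrated with a short sketch. The score matrix below is hypothetical, not the study's data, and the ICC computed here is the two-way mixed, consistency form ICC(3,1), a common choice for two fixed raters; the paper does not state which ICC variant was used, so this is an assumption for illustration only.

```python
from statistics import mean, variance

def pearson(x, y):
    """Pearson product-moment correlation between two score lists."""
    mx, my = mean(x), mean(y)
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = (sum((a - mx) ** 2 for a in x) * sum((b - my) ** 2 for b in y)) ** 0.5
    return num / den

def icc_3_1(ratings):
    """ICC(3,1), two-way mixed, consistency; rows = subjects, columns = raters."""
    n, k = len(ratings), len(ratings[0])
    grand = mean(x for row in ratings for x in row)
    ss_total = sum((x - grand) ** 2 for row in ratings for x in row)
    ss_rows = k * sum((mean(row) - grand) ** 2 for row in ratings)
    ss_cols = n * sum((mean(col) - grand) ** 2 for col in zip(*ratings))
    ms_rows = ss_rows / (n - 1)
    ms_err = (ss_total - ss_rows - ss_cols) / ((n - 1) * (k - 1))
    return (ms_rows - ms_err) / (ms_rows + (k - 1) * ms_err)

def cronbach_alpha(ratings):
    """Cronbach's alpha treating each rater as an 'item'."""
    k = len(ratings[0])
    item_vars = sum(variance(col) for col in zip(*ratings))
    total_var = variance([sum(row) for row in ratings])
    return k / (k - 1) * (1 - item_vars / total_var)

# Hypothetical rubric scores: rows = student solutions, columns = two raters.
scores = [[1, 2], [2, 1], [3, 4], [4, 3], [5, 5]]
r = pearson([row[0] for row in scores], [row[1] for row in scores])
icc = icc_3_1(scores)
alpha = cronbach_alpha(scores)
```

For two raters the three coefficients answer slightly different questions: Pearson measures linear association only, ICC(3,1) also penalises rater disagreement in spread, and alpha corresponds to the reliability of the averaged rating.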

References

  • Albe, V., Venturini, P., & Lascours, J. (2001). Electromagnetic concepts in mathematical representation of physics. Journal of Science Education and Technology, 10, 197-203.
  • Andrade, H., Wang, X. L., Du, Y., & Akawi, R. L. (2009). Rubric-referenced self-assessment and self-efficacy for writing. Journal of Educational Research, 102(4), 287-301.
  • Andrade, H., & Du, Y. (2005). Student perspectives on rubric-referenced assessment. Practical Assessment, Research & Evaluation. http://pareonline.net/getvn.asp?v=10&n=3.
  • Angell, K. (2015). The application of reliability and validity measures to assess the effectiveness of an undergraduate citation rubric. Behavioral and Social Sciences Librarian, 34(1), 2-15.
  • Beer, C. P. (2010). How do pre-service teachers picture various electromagnetic phenomenon? A qualitative study of pre-service teachers’ conceptual understanding of fundamental electromagnetic interaction. [Doctoral dissertation]. Ball State University.
  • Brookhart, S. M., & Chen, F. (2015). The quality and effectiveness of descriptive rubrics. Educational Review, 67(3), 343-368.
  • Caleon, I., & Subramaniam, R. (2005). The impact of a cryogenics-based enrichment programme on attitude towards science and the learning of science concepts. International Journal of Science Education, 27(6), 679-704.
  • Cheng, M. W. T., & Chan, C. K. Y. (2019). An experimental test: Using rubrics for reflective writing to develop reflection. Studies in Educational Evaluation, 61, 176-182.
  • Claesgens, C., Scalise, K., Wilson, M., & Stacy, A. (2009). Mapping student understanding in chemistry: The perspectives of chemists. Science Education, 93(1), 56-85.
  • Cohen, J. (1988). Statistical power analysis for the behavioral sciences. New York: Lawrence Erlbaum Associates.
  • Dattalo, P. (2008). Determining sample size: Balancing power, precision and practicality. Oxford: Oxford University Press.
  • Dori, Y. J., & Belcher, J. (2005). How does technology-enabled active learning affect undergraduate students’ understanding of electromagnetism concepts? Journal of the Learning Sciences, 14(2), 243-279.
  • Gaigher, E., Rogan, J. M., & Braun, M. W. H. (2007). Exploring the development of conceptual understanding through structured problem-solving in physics. International Journal of Science Education, 29(9), 1089-1110.
  • Gray, J. S., Connolly, J. P., & Brown, M. A. (2019). Measuring intercultural knowledge and competence in college essays: Does a performance-based rubric have construct validity? Studies in Educational Evaluation, 62, 142-148.
  • Hafner, J. C., & Hafner, P. M. (2003). Quantitative analysis of the rubric as an assessment tool: An empirical study of student peer-group rating. International Journal of Science Education, 25(12), 1509-1528.
  • Härtel, H. (2018). Electromagnetic induction: An alternative for teaching and understanding. European Journal of Physics Education, 9(2), 1-13.
  • Howell, R. J. (2014). Grading rubrics: Hoopla or help? Innovations in Education and Teaching International, 51(4), 400-410.
  • Hull, M. M., Kuo, E., Gupta, A., & Elby, A. (2013). Problem-solving rubrics revisited: Attending to the blending of informal conceptual and formal mathematical reasoning. Physical Review Special Topics-Physics Education Research, 9(1), 010105-1-16.
  • Jonsson, A. (2014). Rubrics as a way of providing transparency in assessment. Assessment and Evaluation in Higher Education, 39(7), 840-852.
  • Jonsson, A., & Svingby, G. (2007). The use of scoring rubrics: Reliability, validity and educational consequences. Educational Research Review, 2, 130-144.
  • Karam, R. (2014). Framing the structural role of mathematics in physics lectures: A case study on electromagnetism. Physical Review Special Topics-Physics Education Research, 10(1), 010119-1-23.
  • Kocakülah, M. S. (2010). Development and application of a rubric for evaluating students’ performance on Newton’s laws of motion. Journal of Science Education and Technology, 19(2), 146-164.
  • Koo, T. K., & Li, M. Y. (2016). A guideline of selecting and reporting intraclass correlation coefficients for reliability research. Journal of Chiropractic Medicine, 15(2), 155-163.
  • Larkin, J. H. (1985). Understanding problem representations and skill in physics. In S. F. Chipman, J. W. Segal & R. Glaser (Eds.), Thinking and learning skills: Research and open questions (pp. 141-159). Hillsdale, NJ: Lawrence Erlbaum Associates.
  • Lee, C.-Y., & Cherner, T. S. (2015). A comprehensive evaluation rubric for assessing instructional apps. Journal of Information Technology Education: Research, 14, 21-53.
  • Menendez-Varela, J.-L., & Gregori-Giralt, E. (2018). Rubrics for developing students’ professional judgement: A study of sustainable assessment in arts education. Studies in Educational Evaluation, 58, 70-79.
  • Panadero, E., & Jonsson, A. (2013). The use of scoring rubrics for formative assessment purposes revisited: A review. Educational Research Review, 9, 129-144.
  • Reddy, Y. M., & Andrade, H. (2010). A review of rubric use in higher education. Assessment and Evaluation in Higher Education, 35(4), 435-448.
  • Reynolds-Keefer, L. (2010). Rubric referenced assessment in teacher preparation: An opportunity to learn by using. Practical Assessment, Research & Evaluation. http://pareonline.net/getvn.asp?v=15&n=8.
  • Shadle, S. E., Brown, E. C., Towns, M. H., & Warner, D. L. (2012). A rubric for assessing students’ experimental problem-solving ability. Journal of Chemical Education, 89, 319-325.
  • Thong, W. M., & Gunstone, R. (2008). Some student conceptions of electromagnetic induction. Research in Science Education, 38, 31-44.
  • Weir, J. P. (2005). Quantifying test-retest reliability using the intraclass correlation coefficient and the SEM. Journal of Strength and Conditioning Research, 19(1), 231-240.
  • Wu, P.-H., Wu, H.-K., & Hsu, Y.-S. (2014). Establishing the criterion-related, construct and content validities of a simulation-based assessment of inquiry abilities. International Journal of Science Education, 36(10), 1630-1650.
  • Zajchowski, R., & Martin, J. (1993). Differences in the problem solving of stronger and weaker novices in physics: Knowledge strategies or knowledge structure. Journal of Research in Science Teaching, 30, 459-470.
  • Zuza, K., Almudi, J.-M., Leniz, A., & Guisasola, J. (2014). Addressing students’ difficulties with Faraday’s law: A guided problem solving approach. Physical Review Special Topics-Physics Education Research, 10(1), 010122-1-16.
  • Zuza, K., De Cock, M., van Kampen, P., Kelly, T., & Guisasola, J. (2020). Guiding students towards an understanding of the electromotive force concept in electromagnetic phenomena through a teaching-learning sequence. Physical Review Physics Education Research, 16(2), 020110-1-14.
There are 36 citations in total.

Details

Primary Language English
Subjects Studies on Education
Journal Section Research Articles
Authors

Aysel Kocakülah 0000-0002-3472-4707

Publication Date May 1, 2022
Acceptance Date December 30, 2021
Published in Issue Year 2022 Volume: 9 Issue: 3

Cite

APA Kocakülah, A. (2022). Development and use of a rubric to assess undergraduates’ problem solutions in Physics. Participatory Educational Research, 9(3), 362-382. https://doi.org/10.17275/per.22.71.9.3