Research Article

The use of on-screen calculator as a digital tool in technology-enhanced items

Year 2023, Special Issue, 224–242, 27.12.2023
https://doi.org/10.21449/ijate.1368600

Abstract

This study examined the effect of on-screen calculator use on eighth-grade students’ performance on two TIMSS 2019 Problem Solving and Inquiry Tasks items, which serve as examples of computer-administered technology-enhanced items. Three logistic regression models were run in which the dependent variable was giving a correct response to an item and the independent variables were mathematics achievement and on-screen calculator use. Data from students in 12 countries and 4 benchmarking participants were analyzed, and comparisons were made across the analyses. The results indicate that using the on-screen calculator is associated with higher odds of a correct response on both items, above and beyond students’ mathematics achievement scores. These results support including an on-screen calculator as a digital tool in technology-enhanced items that require problem solving.
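
To make the modeling concrete, the following is a minimal sketch of one of the three logistic regression models described above. It is illustrative only, not the authors’ code: the file name and the columns "correct" and "calc_used" are hypothetical stand-ins, while BSMMAT01 (first grade-8 mathematics plausible value) and TOTWGT (total student weight) follow TIMSS 2019 variable naming. A faithful replication would pool estimates across all five plausible values and use jackknife standard errors, as the cited intsvy package (Caro & Biecek, 2022) does.

    # Sketch of one item-level model: P(correct) ~ achievement + calculator use.
    import numpy as np
    import pandas as pd
    import statsmodels.api as sm

    # Hypothetical item-level extract of the TIMSS 2019 PSI data.
    df = pd.read_csv("timss2019_psi_item.csv")

    # DV: 1 if the student answered the item correctly, 0 otherwise.
    y = df["correct"]
    # IVs: one mathematics plausible value and on-screen calculator use (0/1).
    X = sm.add_constant(df[["BSMMAT01", "calc_used"]])

    # A weighted binomial GLM approximates the design-weighted logistic model;
    # freq_weights is a simplification of TIMSS's jackknife variance estimation.
    fit = sm.GLM(y, X, family=sm.families.Binomial(),
                 freq_weights=df["TOTWGT"]).fit()

    # Odds ratio for calculator use, holding achievement constant.
    print(np.exp(fit.params["calc_used"]))

The exponentiated coefficient for calc_used corresponds to the quantity the abstract refers to: an odds ratio above 1 means calculator users had higher odds of answering correctly at the same achievement level.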

References

  • Betts, J., Muntean, W., Kim, D., & Kao, S. (2022). Evaluating different scoring methods for multiple response items providing partial credit. Educational and Psychological Measurement, 82(1), 151–176. https://doi.org/10.1177/0013164421994636
  • Bozdogan, H. (1987). Model selection and Akaike’s Information Criterion (AIC): the general theory and its analytical extensions. Psychometrika, 52(3), 345–370. https://doi.org/10.1007/BF02294361
  • Brolin, H., & Bjork, L.-E. (1992). Introducing calculators in Swedish schools. In J.T. Fey & C.R. Hirsch (Eds.), Calculators in mathematics education (pp. 226–232). Reston, VA: National Council of Teachers of Mathematics.
  • Bryant, W. (2017). Developing a strategy for using technology-enhanced items in large-scale standardized tests. Practical Assessment, Research & Evaluation, 22(1).
  • Caro, D.H., & Biecek, P. (2022). intsvy: International Assessment Data Manager. R package version 2.6. https://CRAN.R-project.org/package=intsvy
  • Clyne, C.M. (2015). The effects of different scoring methodologies on item and test characteristics of technology-enhanced items [Unpublished doctoral dissertation]. University of Kansas, Lawrence, Kansas. https://kuscholarworks.ku.edu/bitstream/handle/1808/21675/Clyne_ku_0099D_14314_DATA_1.pdf?sequence=1
  • Cohen, A.S., & Kim, S. (1992). Detecting calculator effects on item performance. Applied Measurement in Education, 5(4), 303–320. https://doi.org/10.1207/s15324818ame0504_2
  • Dick, T. (1988). The continuing calculator controversy. Arithmetic Teacher, 37–41.
  • Ellington, A.J. (2003). A meta-analysis of the effects of calculators on students’ achievement and attitude levels in precollege mathematics classes. Journal for Research in Mathematics Education, 34(5), 433–463. https://doi.org/10.2307/30034795
  • Eurydice (2011). Mathematics education in Europe: common challenges and national policies. http://keyconet.eun.org/c/document_library/get_file?uuid=e456b461-d3cd-4bd5-aabc-2cae2d4bfaf9&groupId=11028
  • Fishbein, B., Foy, P., & Yin, L. (2021). TIMSS 2019 User Guide for the International Database (2nd ed.). TIMSS & PIRLS International Study Center, Lynch School of Education and Human Development, Boston College and International Association for the Evaluation of Educational Achievement (IEA). https://timssandpirls.bc.edu/timss2019/international-database
  • Foy, P., & LaRoche, S. (2020). Estimating standard errors in the TIMSS 2019 results. In M. O. Martin, M. von Davier, & I. V. S. Mullis (Eds.), Methods and Procedures: TIMSS 2019 Technical Report (pp. 14.1–14.60). TIMSS & PIRLS International Study Center, Lynch School of Education and Human Development, Boston College and International Association for the Evaluation of Educational Achievement (IEA).
  • Hembree, R., & Dessart, D.J. (1986). Effects of hand-held calculators in precollege mathematics education: A meta-analysis. Journal for Research in Mathematics Education, 17(2), 83–99. https://doi.org/10.2307/749255
  • Hembree, R., & Dessart, D.J. (1992). Research on calculators in mathematics education. In J.T. Fey & C.R. Hirsch (Eds.), Calculators in mathematics education: 1992 NCTM Yearbook (pp. 23–32). Reston, VA: The National Council of Teachers of Mathematics.
  • Hopkins, M.H. (1992). The use of calculators in assessment of mathematics. In J.T. Fey & C.R. Hirsch (Eds.), Calculators in mathematics education: 1992 NCTM Yearbook (pp. 158–166). Reston, VA: The National Council of Teachers of Mathematics.
  • Impara, J.C., & Foster, D. (2006). Question and test development strategies to minimize test fraud. In S.M. Downing & T.M. Haladyna (Eds.), Handbook of test development (pp. 91–114). Lawrence Erlbaum Associates.
  • Koay, P.L. (2006). Calculator use in primary school mathematics: A Singapore perspective. The Mathematics Educator, 9(2), 97–111.
  • Lorié, W. (2016). Automated scoring of multicomponent tasks. In Y. Rosen, S. Ferrara, & M. Mosharraf (Eds.), Handbook of research on technology tools for real-world skill development (pp. 627–658). IGI Global. https://doi.org/10.4018/978-1-4666-9441-5.ch024
  • Loyd, B. H. (1991). Mathematics test performance: The effects of item type and calculator use. Applied Measurement in Education, 4(1), 11–22.
  • Ludwig, S., & Rausch, A. (2022). The relationship between problem-solving behaviour and performance – Analysing tool use and information retrieval in a computer-based office simulation. Journal of Computer Assisted Learning, 1–27. https://doi.org/10.1111/jcal.12770
  • Martin, M.O., Mullis, I.V.S., & Foy, P. (2017). TIMSS 2019 Assessment Design. In I.V.S. Mullis, & M.O. Martin (Eds.), TIMSS 2019 Assessment Frameworks (pp. 79–92). TIMSS & PIRLS International Study Center, Lynch School of Education and Human Development, Boston College and International Association for the Evaluation of Educational Achievement (IEA).
  • Martin, M.O., von Davier, M., & Mullis, I.V.S. (2020). Methods and Procedures: TIMSS 2019 Technical Report. TIMSS & PIRLS International Study Center, Lynch School of Education and Human Development, Boston College and International Association for the Evaluation of Educational Achievement (IEA).
  • Mullis, I.V.S., & Martin, M.O. (2017). TIMSS 2019 Assessment Frameworks. TIMSS & PIRLS International Study Center, Lynch School of Education and Human Development, Boston College and International Association for the Evaluation of Educational Achievement (IEA). https://timss2019.org/psi/
  • Mullis, I.V.S., Martin, M.O., Fishbein, B., Foy, P., & Moncaleano, S. (2021). Findings from the TIMSS 2019 problem solving and inquiry tasks. TIMSS & PIRLS International Study Center, Lynch School of Education and Human Development, Boston College and International Association for the Evaluation of Educational Achievement (IEA). https://timss2019.org/psi/
  • Mullis, I.V.S., Martin, M.O., Goh, S., & Cotter, K. (Eds.). (2016). TIMSS 2015 Encyclopedia: Education Policy and Curriculum in Mathematics and Science. TIMSS & PIRLS International Study Center, Lynch School of Education and Human Development, Boston College and International Association for the Evaluation of Educational Achievement (IEA). http://timssandpirls.bc.edu/timss2015/encyclopedia/
  • National Council of Teachers of Mathematics (2015). Calculator Use in Elementary Grades. https://www.nctm.org/Standards-and-Positions/Position-Statements/Calculator-Use-in-Elementary-Grades
  • Parshall, C.G., Harmes, J.C., Davey, T., & Pashley, P.J. (2010). Innovative items for computerized testing. In W.J. van der Linden & C.A.W. Glas (Eds.), Elements of adaptive testing (pp. 215–230). Springer. https://doi.org/10.1007/978-0-387-85461-8
  • Plunkett, S. (1978). Decomposition and all that rot. Mathematics in Schools, 8(3), 2–5.
  • R Core Team (2022). R: A Language and Environment for Statistical Computing. R Foundation for Statistical Computing, Vienna, Austria. https://www.R-project.org/
  • Russell, M., & Moncaleano, S. (2019). Examining the use and construct fidelity of technology-enhanced items employed by K-12 testing programs. Educational Assessment, 24(4), 286–304. https://doi.org/10.1080/10627197.2019.1670055
  • Salles, F., Dos Santos, R., & Keskpaik, S. (2020). When didactics meet data science: process data analysis in large-scale mathematics assessment in France. Large-scale Assessments in Education, 8(7). https://doi.org/10.1186/s40536-020-00085-y
  • Scalise, K., & Gifford, B. (2006). Computer-based assessment in e-learning: a framework for constructing “intermediate constraint” questions and tasks for technology platforms. Journal of Technology, Learning, and Assessment, 4(6).
  • Schnipke, D.L. (1995). Assessing speededness in computer-based tests using item response times [Unpublished doctoral dissertation]. Johns Hopkins University, Baltimore, MD.
  • Setzer, J.C., Wise, S.L., van den Heuvel, J.R., & Ling, G. (2013). An investigation of examinee test-taking effort on a large-scale assessment. Applied Measurement in Education, 26(1), 34–49. https://doi.org/10.1080/08957347.2013.739453
  • Sireci, S.G., & Zenisky, A.L. (2006). Innovative item formats in computer-based testing: In pursuit of improved construct representation. In S.M. Downing & T.M. Haladyna (Eds.), Handbook of test development (pp. 329–348). Lawrence Erlbaum Associates.
  • Smith, T.J., & McKenna, C.M. (2013). A comparison of logistic regression pseudo R2 indices. General Linear Model Journal, 39(2), 17–26. http://www.glmj.org/archives/GLMJ_2014v39n2.html
  • Sparrow, L., Kershaw, L., & Jones, K. (1994). Issues in primary mathematics education: calculators: research and curriculum implications. Perth, Australia: Mathematics, Science & Technology Education Centre, Edith Cowan University.
  • Tarr, J.E., Uekawa, K., Mittag, K.C., & Lennex, L. (2000). A comparison of calculator use in eighth-grade mathematics classrooms in the United States, Japan, and Portugal: Results from the Third International Mathematics and Science Study. School Science and Mathematics, 100(3), 139–150. https://doi.org/10.1111/j.1949-8594.2000.tb17249.x
  • Ulitzsch, E., Domingue, B.W., Kapoor, R., Kanopka, K., & Rios, J.A. (2023). A probabilistic filtering approach to non-effortful responding. Educational Measurement: Issues and Practice. Advance online publication. https://doi.org/10.1111/emip.12567
  • Vasquez, S., & McCabe, T.W. (2002). The effect of calculator usage in the learning of basic skills. Research and Teaching in Developmental Education, 19(1), 33–40.
  • Walcott, C., & Stickles, P.R. (2012). Calculator Use on NAEP: A look at fourth- and eighth-grade mathematics achievement. School Science and Mathematics, 112(4), 241–254. https://doi.org/10.1111/j.1949-8594.2012.00140.x
  • WIDA (2023). 2022–2023 Accessibility & Accommodations Manual. https://wida.wisc.edu/sites/default/files/resource/Accessibility-Accommodations-Manual.pdf
  • Williams, D. (1987). Using calculators in assessing mathematics achievement. Arithmetic Teacher, 34(2), 21–23.
  • Wise, S.L. (2017). Rapid-guessing behavior: Its identification, interpretation, and implications. Educational Measurement: Issues and Practice, 36, 52–61. https://doi.org/10.1111/emip.12165
  • Wise, S.L., & Gao, L. (2017). A general approach to measuring test-taking effort on computer-based tests. Applied Measurement in Education, 30(4), 343–354. https://doi.org/10.1080/08957347.2017.1353992
  • Wise, S.L., Kingsbury, G.G., Thomason, J., & Kong, X. (2004, April). An investigation of motivation filtering in a statewide achievement testing program. Paper presented at the annual meeting of the National Council on Measurement in Education, San Diego, CA.
  • Wise, S.L., & Ma, L. (2012, April). Setting response time thresholds for a CAT item pool: The normative threshold method. Paper presented at the annual meeting of the National Council on Measurement in Education, Vancouver, Canada.
  • Wolfe, E.W. (2010). What impact does calculator use have on test results? Test, Measurement & Research Services Bulletin, 14, 1–6.


Details

Primary Language English
Subjects Measurement and Evaluation in Education (Other)
Journal Section Special Issue 2023
Authors

Ozge Ersan (ORCID: 0000-0003-0196-5472)

Burcu Parlak (ORCID: 0000-0001-7515-7262)

Publication Date December 27, 2023
Submission Date September 29, 2023
Published in Issue Year 2023

Cite

APA Ersan, O., & Parlak, B. (2023). The use of on-screen calculator as a digital tool in technology-enhanced items. International Journal of Assessment Tools in Education, 10(Special Issue), 224-242. https://doi.org/10.21449/ijate.1368600
