Systematic Reviews and Meta-Analysis

Bilgi İşlemsel Düşünme Becerisinin Değerlendirilmesine İlişkin Sistematik Alanyazın Taraması (A Systematic Literature Review on Assessing Computational Thinking)

Year 2019, 1(2), 99-120, 20.12.2019
https://doi.org/10.38151/akef.597505

Abstract

In parallel with technological advances, computational thinking (CT) has been added to the list of skills expected of the new generation. When it comes to evaluating the effects of instruction designed to develop this skill, which can briefly be defined as "problem solving using technology", no widely accepted, valid, and reliable assessment methods appear to have been established yet. Identifying the different assessment methods in the literature, revealing how they differ, and discussing their positive and negative aspects is therefore expected to make an important contribution to the assessment methods to be developed. To this end, 47 studies meeting the inclusion and exclusion criteria were selected for analysis from the Google Scholar, Web of Science, and ERIC databases. The methods these studies used to assess CT were examined through content analysis under the following headings: i) the most frequently assessed CT components, ii) data collection methods, iii) data analysis methods, iv) the content of the data collection instruments, v) target groups, and vi) validity and reliability studies of the assessment methods. According to the results of the analysis, the most frequently assessed CT components were abstraction, algorithmic thinking, decomposition, testing and debugging, and data literacy. The five most common data collection methods were tasks, multiple-choice questions, projects, open-ended questions, and interviews, and the collected data were mostly analyzed with Likert scales or rubrics. The content of the data collection instruments consisted of programming, perception/attitude, mathematics, daily-life problems, and general ability. The target group of the assessment methods was most often the K-12 level. In addition, both validity and reliability studies could be found for only six of the 47 studies examined. The different assessment methods identified were compared, and their advantages and disadvantages were discussed. Based on the results, suggestions are offered for future research on methods of assessing CT.

References

  • Aiken, J. M., Caballero, M. D., Douglas, S. S., Burk, J. B., Scanlon, E. M., Thoms, B. D., et al. (2013). Understanding student computational thinking with computational modeling. Proceedings of American Institute of Physics Conference, 1513(1), 46-49.
  • Andrews, R., & Harlen, W. (2006). Issues in synthesizing research in education. Educational Research, 48(3), 287-299.
  • Anistyasari, Y., & Kurniawan, A. (2018). Exploring Computational Thinking to Improve Energy-Efficient Programming Skills. Proceedings of MATEC Web of Conferences, 197, 15011.
  • Araujo, A. L. S. O., Santos, J. S., Andrade, W. L., Guerrero, D. D. S., & Dagienė, V. (2017). Exploring computational thinking assessment in introductory programming courses. Proceedings of Frontiers in Education Conference, 1-9.
  • Atmatzidou, S., & Demetriadis, S. (2014). How to support students’ computational thinking skills in educational robotics activities. Proceedings of Fourth International Workshop Teaching Robotics, Teaching with Robotics & Fifth International Conference Robotics in Education, 43-50.
  • Atmatzidou, S., & Demetriadis, S. (2016). Advancing students’ computational thinking skills through educational robotics: A study on age and gender relevant differences. Robotics and Autonomous Systems, 75, 661-670.
  • Bakırcı, H., & Kutlu, E. (2018). Fen bilimleri öğretmenlerinin FeTeMM yaklaşımı hakkındaki görüşlerinin belirlenmesi. Turkish Journal of Computer and Mathematics Education, 9(2), 367-389.
  • Barr, D., Harrison, J., & Conery, L. (2011). Computational thinking: A digital age skill for everyone. Learning & Leading with Technology, 38(6), 20-23.
  • Basogain, X., Olabe, M. A., Olabe, J. C., Ramírez, R., Del Rosario, M., & Garcia, J. (2016). PC-01: Introduction to computational thinking: Educational technology in primary and secondary education. Proceeding of International Symposium on Computers in Education, 1-5.
  • Bati, K., Yetişir, M. I., Çalişkan, I., Güneş, G., & Gül Saçan, E. (2018). Teaching the concept of time: A steam-based program on computational thinking in science education. Cogent Education, 5(1), 1-16.
  • Bell, T., Alexander, J., Freeman, I., & Grimley, M. (2009). Computer science unplugged: School students doing real computing without computers. The New Zealand Journal of Applied Computing and Information Technology, 13(1), 20-29.
  • Bers, M. U., Flannery, L., Kazakoff, E. R., & Sullivan, A. (2014). Computational thinking and tinkering: Exploration of an early childhood robotics curriculum. Computers & Education, 72, 145-157.
  • Brennan, K., & Resnick, M. (2012). New frameworks for studying and assessing the development of computational thinking. Paper presented at the Annual Meeting of the American Educational Research Association, Vancouver, Canada.
  • Caldwell, H., & Smith, N. (Eds.). (2016). Teaching Computing Unplugged in Primary Schools: Exploring Primary Computing Through Practical Activities Away from the Computer. London: Sage Publication.
  • Chang, Z., Sun, Y., Wu, T. Y., & Guizani, M. (2018). Scratch analysis tool (SAT): A modern scratch project analysis tool based on ANTLR to assess computational thinking skills. Proceedings of the 14th International Wireless Communications & Mobile Computing Conference, 950-955.
  • Chen, G., Shen, J., Barth-Cohen, L., Jiang, S., Huang, X., & Eltoukhy, M. (2017). Assessing elementary students’ computational thinking in everyday reasoning and robotics programming. Computers & Education, 109, 162-175.
  • Dagiene, V., & Stupuriene, G. (2016). Bebras--A sustainable community building model for the concept based learning of informatics and computational thinking. Informatics in Education, 15(1), 25-44.
  • De Araujo, A. L. S. O., Andrade, W. L., & Guerrero, D. D. S. (2016). A systematic mapping study on assessing computational thinking abilities. Proceedings of Frontiers in Education Conference, 1-9.
  • Elo, S., & Kyngäs, H. (2008). The qualitative content analysis process. Journal of Advanced Nursing, 62(1), 107-115.
  • Ergai, A., Cohen, T., Sharp, J., Wiegmann, D., Gramopadhye, A., & Shappell, S. (2016). Assessment of the human factors analysis and classification system (HFACS): Intra-rater and inter-rater reliability. Safety Science, 82, 393-398.
  • Falloon, G. (2015). Building computational thinking through programming in K-6 education: A New Zealand experience. In L. Gomez Chova, A. Lopez Martinez, & I. Chandel Torres (Eds.), EDULearn Proceedings (pp. 882-892). Barcelona, Spain: IATED Academy.
  • Games, A., & Kane, L. (2012). Examining trends in adolescents’ computational thinking skills within the globaloria educational game design environment. Retrieved January 8, 2019, from http://www.worldwideworkshop.org/pdfs/GlobaloriaExaminingTrendsAdolescentsComputSkillsGamesKaneAug2012.pdf
  • Gouws, L. A., Bradshaw, K., & Wentworth, P. (2013a). Computational thinking in educational activities: an evaluation of the educational game light-bot. Proceedings of the 18th ACM Conference on Innovation and Technology in Computer Science Education, 10-15.
  • Gouws, L. A., Bradshaw, K., & Wentworth, P. (2013b). First year student performance in a test for computational thinking. Proceedings of the South African Institute for Computer Scientists and Information Technologists Conference, 271-277.
  • Gretter, S., & Yadav, A. (2016). Computational thinking and media & information literacy: An integrated approach to teaching twenty first century skills. TechTrends, 60(5), 510-516.
  • Grover, S. (2015). “Systems of Assessments” for deeper learning of computational thinking in K-12. Proceedings of the 2015 Annual Meeting of the American Educational Research Association, 15-20.
  • Grover, S., Bienkowski, M., Basu, S., Eagle, M., Diana, N., & Stamper, J. (2017). A framework for hypothesis-driven approaches to support data-driven learning analytics in measuring computational thinking in block-based programming. Proceedings of the Seventh International Learning Analytics & Knowledge Conference, 530-531.
  • Grover, S., & Pea, R. (2013). Computational thinking in K–12: A review of the state of the field. Educational Researcher, 42(1), 38-43.
  • Guba, E. G., & Lincoln, Y. S. (1982). Epistemological and methodological bases of naturalistic inquiry. Educational Communication and Technology Journal, 30 (4), 233-252
  • Hoover, A. K., Barnes, J., Fatehi, B., Moreno-León, J., Puttick, G., Tucker-Raymond, E., et al. (2016). Assessing computational thinking in students' game designs. Proceedings of the 2016 Annual Symposium on Computer-Human Interaction in Play, 173-179.
  • Hsu, T. C., Chang, S. C., & Hung, Y. T. (2018). How to learn and how to teach computational thinking: Suggestions based on a review of the literature. Computers & Education, 126, 296-310.
  • Hutchins, N., Biswas, G., Conlin, L., Emara, M., Grover, S., Basu, S., et al. (2018). Studying synergistic learning of physics and computational thinking in a learning by modeling environment. Paper presented at the 26th International Conference on Computers in Education, Manila, Philippines.
  • ISTE. (2014). Computational thinking for all. Retrieved April 18, 2019, from https://www.iste.org/explore/articleDetail articleid=152&category=Solutions&article=Computational-thinking-for-all
  • Jonsson, A., & Svingby, G. (2007). The use of scoring rubrics: Reliability, validity and educational consequences. Educational Research Review, 2(2), 130-144.
  • Jun, S., Han, S., & Kim, S. (2017). Effect of design-based learning on improving computational thinking. Behaviour & Information Technology, 36(1), 43-53.
  • Karaçam, Z. (2013). Sistematik derleme metodolojisi: Sistematik derleme hazırlamak için bir rehber. Dokuz Eylül Üniversitesi Hemşirelik Fakültesi Elektronik Dergisi, 6(1), 26-33.
  • Kalelioglu, F., Gülbahar, Y., & Kukul, V. (2016). A framework for computational thinking based on a systematic research review. Baltic Journal of Modern Computing, 4(3), 583-596.
  • Katchapakirin, K., & Anutariya, C. (2018). An architectural design of ScratchThAI: A conversational agent for computational thinking development using Scratch. Paper presented at the 10th International Conference on Advances in Information Technology, Bangkok, Thailand.
  • Kim, B., Kim, T., & Kim, J. (2013). Paper and pencil programming strategy toward computational thinking for non-majors: Design your solution. Journal of Educational Computing Research, 49(4), 437-459.
  • Kitchenham, B. (2004). Procedures for performing systematic reviews. (NICTA Technical Report 0400011T.1). Keele, UK: Keele University Department of Computer Science.
  • Koh, K. H., Basawapatna, A., Bennett, V., & Repenning, A. (2010). Towards the automatic recognition of computational thinking for adaptive visual language learning. Proceedings of Symposium on Visual Languages and Human-Centric Computing, 59-66.
  • Korkmaz, Ö., Çakir, R., & Özden, M. Y. (2017). A validity and reliability study of the computational thinking scales (CTS). Computers in Human Behavior, 72, 558-569.
  • Kramer, J. (2007). Is abstraction the key to computing?. Communications of the ACM, 50(4), 36-42.
  • Landis, J. R., & Koch, G. G. (1977). The measurement of observer agreement for categorical data. Biometrics, 33(1), 159-174.
  • Lee, T. Y., Mauriello, M. L., Ahn, J., & Bederson, B. B. (2014). CTArcade: Computational thinking with games in school age children. International Journal of Child-Computer Interaction, 2(1), 26-33.
  • Leonard, J., Buss, A., Gamboa, R., Mitchell, M., Fashola, O. S., Hubert, T., et al. (2016). Using robotics and game design to enhance children’s self-efficacy, STEM attitudes, and computational thinking skills. Journal of Science Education and Technology, 25(6), 860-876.
  • Leonard, J., Mitchell, M., Barnes-Johnson, J., Unertl, A., Outka-Hill, J., Robinson, R., et al. (2018). Preparing teachers to engage rural students in computational thinking through robotics, game design, and culturally responsive teaching. Journal of Teacher Education, 69(4), 386-407.
  • Ling, U. L., Saibin, T. C., Naharu, N., Labadin, J., & Aziz, N. A. (2018). An evaluation tool to measure computational thinking skills: pilot investigation. National Academy of Managerial Staff of Culture and Arts Herald, 1, 606-614.
  • Lockwood, J., & Mooney, A. (2018). Computational thinking in secondary education: Where does it fit? A systematic literary review. International Journal of Computer Science Education in Schools, 2(1), 41-60.
  • Looi, C. K., How, M. L., Longkai, W., Seow, P., & Liu, L. (2018). Analysis of linkages between an unplugged activity and the development of computational thinking. Computer Science Education, 28(3), 255-279.
  • McHugh, M. L. (2012). Interrater reliability: the kappa statistic. Biochemia Medica, 22(3), 276-282.
  • Moreno-León, J., & Robles, G. (2015). Analyze your Scratch projects with Dr. Scratch and assess your computational thinking skills. Paper presented at the Scratch Conference, Amsterdam.
  • Román-González, M., Moreno-León, J., & Robles, G. (2017). Complementary tools for computational thinking assessment. Proceedings of International Conference on Computational Thinking Education, 154-159.
  • Nesiba, N., Pontelli, E., & Staley, T. (2015, October). DISSECT: Exploring the relationship between computational thinking and English literature in K-12 curricula. Proceedings of Frontiers in Education Conference, 1-8.
  • OECD. (2018). The future of education and skills: Education 2030. Retrieved July 22, 2019, from https://www.oecd.org/education/2030/E2030%20Position%20Paper%20(05.04.2018).pdf
  • Orton, K., Weintrop, D., Beheshti, E., Horn, M., Jona, K., & Wilensky, U. (2016). Bringing computational thinking into high school mathematics and science classrooms. Proceedings of 12th International Conference of the Learning Sciences, 2, 705-712.
  • Pérez-Marín, D., Hijón-Neira, R., Bacelo, A., & Pizarro, C. (2018). Can computational thinking be improved by using a methodology based on metaphors and scratch to teach computer programming to children?. Computers in Human Behavior, (in press).
  • Philip, M., Renumol, V. G., & Gopeekrishnan, R. (2013). A pragmatic approach to develop computational thinking skills in novices in computing education. Proceedings of International Conference in MOOC, Innovation and Technology in Education, 199-204.
  • Proctor, C., & Blikstein, P. (2018). How Broad is Computational Thinking? A Longitudinal Study of Practices Shaping Learning in Computer Science. Proceedings of the 13th International Conference of the Learning Sciences, 1, 544-551.
  • Roadrangka, V., Yeany, R. H., & Padilla, M. J. (1983). The construction and validation of group assessment of logical thinking (GALT). Paper presented at the Annual Meeting of the National Association for Research in Science Teaching, Dallas, USA.
  • Román-González, M. (2015). Computational thinking test: Design guidelines and content validation. Proceedings of EDULEARN15 Conference, 2436-2444.
  • Romero, M., Lepage, A., & Lille, B. (2017). Computational thinking development through creative programming in higher education. International Journal of Educational Technology in Higher Education, 14(42), 1-15.
  • Rowe, E., Asbell-Clarke, J., Cunningham, K., & Gasca, S. (2017). Assessing implicit computational thinking in Zoombinis Gameplay: Pizza Pass, Fleens & Bubblewonder Abyss. Proceedings of the Annual Symposium on Computer-Human Interaction in Play, 195-200.
  • Seiter, L., & Foreman, B. (2013). Modeling the learning progressions of computational thinking of primary grade students. Proceedings of the Ninth Annual International ACM Conference on International Computing Education Research, 59-66.
  • Sengupta, P., Kinnebrew, J. S., Basu, S., Biswas, G., & Clark, D. (2013). Integrating computational thinking with K-12 science education using agent-based computation: A theoretical framework. Education and Information Technologies, 18(2), 351-380.
  • Shute, V. J., Sun, C., & Asbell-Clarke, J. (2017). Demystifying computational thinking. Educational Research Review, 22, 142-158.
  • Sondakh, D. E. (2018). Review of computational thinking assessment in higher education. Retrieved May 29, 2019, from https://www.researchgate.net/profile/Debby_Sondakh/publication/324984840_Review_of_Computational_Thinking_Assessment_in_Higher_Education/links/5af0378aa6fdcc8508b96527/Review-of-Computational-Thinking-Assessment-in-Higher-Education.pdf
  • Streubert, H. J., & Carpenter, D. R. (2011). Qualitative research in nursing (5th ed.). Philadelphia: Lippincott Williams & Wilkins.
  • Wang, D., Wang, T., & Liu, Z. (2014). A tangible programming tool for children to cultivate computational thinking. The Scientific World Journal, 2014, 1-11.
  • Weese, J. L., & Feldhausen, R. (2017). STEM Outreach: Assessing computational thinking and problem solving. Paper presented at Annual American Society for Engineering Education Conference & Exposition, Ohio, USA.
  • Weinberg, A. E. (2013). Computational thinking: An investigation of the existing scholarship and research. Unpublished Doctoral Dissertation, Colorado State University, Colorado, USA.
  • Werner, L., Denner, J., & Campe, S. (2014). Using computer game programming to teach computational thinking skills. In K. Schrier (Ed.), Learning, Education and Games (pp.37-53). Pittsburgh: ETC Press.
  • Werner, L., Denner, J., Campe, S., & Kawamoto, D. C. (2012). The fairy performance assessment: measuring computational thinking in middle school. Proceedings of the 43rd ACM Technical Symposium on Computer Science Education, 215-220.
  • Wing, J. M. (2006). Computational thinking. Communications of the ACM, 49(3), 33-35.
  • Witherspoon, E. B., Higashi, R. M., Schunn, C. D., Baehr, E. C., & Shoop, R. (2017). Developing computational thinking through a virtual robotics programming curriculum. ACM Transactions on Computing Education, 18(1), 1-20.
  • Yağcı, M. (2019). A valid and reliable tool for examining computational thinking skills. Education and Information Technologies, 24(1), 929-951.
  • Yeh, K. C., Xie, Y., & Ke, F. (2011). Teaching computational thinking to non-computing majors using spreadsheet functions. Proceedings of Frontiers in Education Conference, 1-5.
  • Zhang, N., & Biswas, G. (2017). Assessing students computational thinking in a learning by modeling environment. Proceedings of International Conference on Computational Thinking Education, 11-16.
  • Zhong, B., Wang, Q., Chen, J., & Li, Y. (2016). An exploration of three-dimensional integrated assessment for computational thinking. Journal of Educational Computing Research, 53(4), 562-590.


Details

Primary Language: Turkish
Subjects: Field Education
Section: Research Articles
Authors

Ezgi Tosik-Gün 0000-0001-7747-1917

Tolga Güyer 0000-0001-9175-5043

Publication Date: December 20, 2019
Acceptance Date: August 27, 2019
Published in Issue: Year 2019

How to Cite

APA Tosik-gün, E., & Güyer, T. (2019). Bilgi İşlemsel Düşünme Becerisinin Değerlendirilmesine İlişkin Sistematik Alanyazın Taraması. Ahmet Keleşoğlu Eğitim Fakültesi Dergisi, 1(2), 99-120. https://doi.org/10.38151/akef.597505
