Automated Written Corrective Feedback (AWCF) tools have gained popularity in English as a foreign language (EFL) writing instruction because of their ability to evaluate written drafts. Teachers have taken an interest in this capability, as it can lighten their workload, especially for lower-order concerns such as vocabulary, grammar, and mechanics. However, little is known about EFL teachers' perspectives on automated feedback and how it can effectively complement their own feedback on higher-order concerns such as organization and content. This study therefore examines EFL teachers' perceptions of integrating Grammarly Premium as an AWCF tool for providing feedback on undergraduate students' writing assignments, focusing on both higher-order concerns (HOCs) and lower-order concerns (LOCs). The study adopted a qualitative research design and employed semi-structured interviews with one tertiary-level teacher for the pilot study and ten tertiary-level teachers for the main study. The data were analyzed using MAXQDA 22. The results revealed that most participants responded favorably to AWCF and Grammarly. On the other hand, some participants found Grammarly inadequate for LOCs because of its inaccurate vocabulary suggestions and its tendency to flag the same grammatical errors repeatedly. Nevertheless, it was still considered more useful for LOCs than for HOCs, since it failed to provide effective feedback on coherence and cohesion, an aspect that still requires a human touch. Further research could investigate how Grammarly can be integrated into writing classes more efficiently so as to limit its drawbacks regarding HOCs.
Bai, L., & Hu, G. (2016). In the face of fallible AWE feedback: How do students respond? Educational Psychology, 37(1), 67–81. https://doi.org/10.1080/01443410.2016.1223275
Barrot, J. S. (2023). Using automated written corrective feedback in the writing classrooms: Effects on L2 writing accuracy. Computer Assisted Language Learning, 36(4), 584–607. https://doi.org/10.1080/09588221.2021.1936071
Cavaleri, M., & Dianati, S. (2016). You want me to check your grammar again? The usefulness of an online grammar checker as perceived by students. Journal of Academic Language and Learning, 10(1), A223–A236. https://journal.aall.org.au/index.php/jall/article/view/393
Chen, C., & Cheng, W. (2008). Beyond the design of automated writing evaluation: Pedagogical practices and perceived learning effectiveness in EFL writing classes. Language Learning & Technology, 12(2), 94–112. http://elc.msu.edu/llt/vol12num2/chencheng.pdf
Cotos, E. (2018). Automated writing evaluation. The TESOL Encyclopedia of English Language Teaching, 1–7. https://doi.org/10.1002/9781118784235.eelt0391
Creswell, J. W. (2012). Educational research: Planning, conducting, and evaluating quantitative and qualitative research (4th ed.). Pearson Education Inc.
Dikli, S. (2006). An overview of automated scoring of essays. The Journal of Technology, Learning and Assessment, 5(1). https://ejournals.bc.edu/index.php/jtla/article/view/1640
Dikli, S., & Bleyle, S. (2016). Automated essay scoring feedback for second language writers: How does it compare to instructor feedback? Assessing Writing, 22, 1–17. https://doi.org/10.1016/j.asw.2014.03.006
Dörnyei, Z. (2007). Research methods in applied linguistics: Quantitative, qualitative, and mixed methodologies. Oxford University Press.
Ene, E., & Upton, T. A. (2014). Learner uptake of teacher electronic feedback in ESL composition. System, 46, 80–95. https://doi.org/10.1016/j.system.2014.07.011
Ersanlı, C. Y., & Yeşilel, D. B. A. (2022). The use of automated writing evaluation tools to foster ESL writing instruction. In G. Yangın-Ekşi, S. Akayoglu & L. Anyango (Eds.), New directions in technology for writing instruction (Vol. 30, pp. 193–209). Springer. https://doi.org/10.1007/978-3-031-13540-8_10
Ferris, D. R. (2006). Does error feedback help student writers? New evidence on the short- and long-term effects of written error correction. In K. Hyland & F. Hyland (Eds.), Feedback in second language writing: Contexts and issues (pp. 81–104). Cambridge University Press. https://doi.org/10.1017/CBO9781139524742.007
Fu, Q., Zou, D., Xie, H., & Cheng, G. (2022). A review of AWE feedback: Types, learning outcomes, and implications. Computer Assisted Language Learning, 1–43. https://doi.org/10.1080/09588221.2022.2033787
Grammarly. (2023). About Us. Retrieved April 21, 2023, from https://www.grammarly.com/about
Grimes, D., & Warschauer, M. (2010). Utility in a fallible tool: A multi-site case study of automated writing evaluation. The Journal of Technology, Learning, and Assessment, 8(6), 4–43. https://ejournals.bc.edu/index.php/jtla/article/view/1625
Jiang, L., Yu, S., & Wang, C. (2020). Second language writing instructors’ feedback practice in response to automated writing evaluation: A sociocultural perspective. System, 93, 102302. https://doi.org/10.1016/j.system.2020.102302
Koltovskaia, S. (2022). Postsecondary L2 writing teachers’ use and perceptions of Grammarly as a complement to their feedback. ReCALL, 1–15. https://doi.org/10.1017/S0958344022000179
Li, R. (2023). Still a fallible tool? Revisiting effects of automated writing evaluation from activity theory perspective. British Journal of Educational Technology, 54(3), 773–789. https://doi.org/10.1111/bjet.13294
Li, Z. (2021). Teachers in automated writing evaluation (AWE) system-supported ESL writing classes: Perception, implementation, and influence. System, 99, 102505. https://doi.org/10.1016/j.system.2021.102505
Li, Z., Link, S., Ma, H., Yang, H., & Hegelheimer, V. (2014). The role of automated writing evaluation holistic scores in the ESL classroom. System, 44, 66–78. https://doi.org/10.1016/j.system.2014.02.007
Link, S., Dursun, A., Karakaya, K., & Hegelheimer, V. (2014). Towards better ESL practices for implementing automated writing evaluation. CALICO Journal, 31(3), 323–344. https://doi.org/10.11139/cj.31.3.323-344
Link, S., Mehrzad, M. & Rahimi, M. (2020). Impact of automated writing evaluation on teacher feedback, student revision, and writing improvement. Computer Assisted Language Learning, 35(4), 605–634. https://doi.org/10.1080/09588221.2020.1743323
McCombes, S. (2023). What is a research design? Types, guide & examples. Scribbr. https://www.scribbr.com/methodology/research-design/
Merriam, S. B., & Tisdell, E. J. (2015). Qualitative research: A guide to design and implementation (4th ed.). John Wiley & Sons.
Qassemzadeh, A., & Soleimani, H. (2016). The impact of feedback provision by Grammarly software and teachers on learning passive structures by Iranian EFL learners. Theory and Practice in Language Studies, 6(9), 1884–1894. http://dx.doi.org/10.17507/tpls.0609.23
Ranalli, J. (2018). Automated written corrective feedback: How well can students make use of it? Computer Assisted Language Learning, 31(7), 653–674. https://doi.org/10.1080/09588221.2018.1428994
Ranalli, J., & Yamashita, T. (2022). Automated written corrective feedback: Error correction performance and timing of delivery. Language Learning & Technology, 26(1), 1–25. http://hdl.handle.net/10125/73465
Silverman, D. (2016). Qualitative research (4th ed.). Sage Publications.
Stevenson, M. (2016). A critical interpretative synthesis: The integration of automated writing evaluation into classroom writing instruction. Computers and Composition, 42, 1–16. https://doi.org/10.1016/j.compcom.2016.05.001
Stevenson, M., & Phakiti, A. (2019). Automated feedback and second language writing. In K. Hyland & F. Hyland (Eds.), Feedback in second language writing: Contexts and issues (2nd ed., pp. 125–142). Cambridge University Press. https://doi.org/10.1017/9781108635547.009
Tawney, J. W., & Gast, D. L. (1984). Single subject research design in special education. Merrill Publishing Company.
Thi, K. N., & Nikolov, M. (2022). How teacher and Grammarly feedback complement one another in Myanmar EFL students’ writing. The Asia-Pacific Education Researcher, 31(6), 767–779. https://doi.org/10.1007/s40299-021-00625-2
Wang, P. L. (2015). Effects of an automated writing evaluation program: Student experiences and perceptions. Electronic Journal of Foreign Language Teaching, 12(1), 79–100. https://e-flt.nus.edu.sg/v12n12015/wang.pdf
Wang, Y. J., Shang, H. F., & Briody, P. (2013). Exploring the impact of using automated writing evaluation in English as a foreign language university students' writing. Computer Assisted Language Learning, 26(3), 234–257. https://doi.org/10.1080/09588221.2012.655300
Warschauer, M., & Grimes, D. (2008). Automated writing assessment in the classroom. Pedagogies: An International Journal, 3(1), 22–36. https://doi.org/10.1080/15544800701771580
Weigle, S. C. (2013). English language learners and automated scoring of essays: Critical considerations. Assessing Writing, 18(1), 85–99. https://doi.org/10.1016/j.asw.2012.10.006
Wilson, J. M., & Czik, A. (2016). Automated essay evaluation software in English Language Arts classrooms: Effects on teacher feedback, student motivation, and writing quality. Computers & Education, 100, 94–109. https://doi.org/10.1016/j.compedu.2016.05.004
Wilson, J., Ahrendt, C., Fudge, E. A., Raiche, A., Beard, G., & MacArthur, C. (2021). Elementary teachers’ perceptions of automated feedback and automated scoring: Transforming the teaching and learning of writing using automated writing evaluation. Computers & Education, 168, 104208. https://doi.org/10.1016/j.compedu.2021.104208
EFL Teachers' Views on Automated Written Corrective Feedback and the Use of Grammarly
Year: 2023, Volume: 5, Issue: 3 (Special Issue for the 100th Anniversary of the Republic of Türkiye), pp. 1183–1198, 29.10.2023
Automated Written Corrective Feedback (AWCF) tools have gained popularity in writing instruction because of their ability to evaluate assignments written by learners of English as a foreign language (EFL). Teachers have shown interest in this capability because it can lighten their workload, especially in identifying lower-order errors such as vocabulary, grammar, and mechanics. However, little is known about EFL teachers' perspectives on the feedback provided by AWCF programs. Within this scope, teachers' perspectives on how automated feedback can effectively address higher-order errors, such as problems of paragraph flow and content, are examined. To this end, this study aims to examine EFL teachers' perceptions of integrating Grammarly Premium as an AWCF tool for providing feedback on writing assignments, focusing on its implications for higher-order concerns (HOCs) and lower-order concerns (LOCs), particularly among undergraduate preparatory students. The study adopted a qualitative research method and conducted semi-structured interviews with one tertiary-level teacher for the pilot study and ten teachers for the main study. The data obtained from the study were analyzed using MAXQDA 22. The results reveal that most participants responded positively to AWCF and Grammarly. On the other hand, some participants found Grammarly inadequate for LOCs because of its incorrect vocabulary suggestions and its repeated flagging of the same grammatical errors. Nevertheless, it was still considered more useful for LOCs than for HOCs because it can identify LOC issues, even though it cannot provide effective feedback on the overall comprehensibility of a text. Accordingly, human intervention is still needed for HOCs. Further research can be conducted to examine how Grammarly can be integrated into writing classes more effectively, thereby limiting its drawbacks regarding HOCs.
Ayan, A. D., & Erdemir, N. (2023). EFL Teachers’ Perceptions of Automated Written Corrective Feedback and Grammarly. Ahmet Keleşoğlu Eğitim Fakültesi Dergisi, 5(3), 1183–1198. https://doi.org/10.38151/akef.2023.106