Research Article
LSTM-driven CLIL: Cybersecurity vocabulary learning with AI

Year 2025, Volume: 8 Issue: 3, 313 - 329, 30.09.2025
https://doi.org/10.31681/jetol.1685183

Abstract

This study presents the development of a custom dataset of L2 gap-fill exercises designed to train Long Short-Term Memory (LSTM) neural networks for CLIL (Content and Language Integrated Learning) settings in subject-specific courses. Targeting English for Specific Purposes (ESP) vocabulary in cybersecurity, privacy, and data protection, the model addresses the dual challenge of mastering domain-specific content and practising the language through structured neural network training. Trained on this custom dataset of gap-fill exercises, the LSTM model simultaneously predicts missing words and performs semantic classification, offering learners the contextualized language training that is a core requirement of CLIL methodology. Experimental results validate the model’s efficacy, demonstrating its potential as an adaptive support tool for CLIL-based education. The framework establishes a novel synergy between AI-enhanced language learning and subject-specific instruction, providing a scalable template for integrating neural networks into CLIL pedagogy.
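The abstract describes gap-fill items that carry two training signals at once: a masked word for the model to predict (the language objective) and a semantic category to classify (the content objective). The paper's actual dataset and architecture are not reproduced on this page, so the following is only a hedged Python sketch of what such items might look like; all sentences, labels, and names are invented for illustration.

```python
# Hypothetical sketch of a dual-signal gap-fill dataset for CLIL training.
# Each item masks one cybersecurity term in an L2 sentence and carries a
# semantic label, so one LSTM with two output heads could learn to predict
# the missing word and classify the topic. All content here is illustrative,
# not material from the paper.

GAP = "<gap>"

raw_items = [
    ("A firewall filters incoming and outgoing network traffic", "firewall", "network security"),
    ("Encryption converts plaintext into unreadable ciphertext", "encryption", "data protection"),
    ("Users may request erasure of their personal data under the GDPR", "erasure", "privacy"),
]

def make_gap_fill(sentence, target, label):
    """Mask the target term while keeping both training signals for the item."""
    tokens = [GAP if t == target else t for t in sentence.lower().split()]
    return {"tokens": tokens, "answer": target, "label": label}

dataset = [make_gap_fill(s, t, l) for s, t, l in raw_items]

# Integer-encode tokens so the items could feed a sequence model: an LSTM
# trainer would read these ids through an embedding layer, with one softmax
# head over the answer vocabulary and another over the semantic labels.
word_to_id = {GAP: 0}
for item in dataset:
    for tok in item["tokens"]:
        word_to_id.setdefault(tok, len(word_to_id))

def encode(item):
    return [word_to_id[t] for t in item["tokens"]]

encoded = [encode(item) for item in dataset]
```

The two supervision targets per item (`answer` and `label`) are what would let a single network serve both halves of the CLIL objective, which is the synergy the abstract claims.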

Ethical Statement

This research adheres to ethical guidelines and principles in conducting and publishing academic work. All participants in the study provided informed consent, and their privacy and confidentiality have been ensured throughout the research process. No experiments involving animals were conducted in this study. The data used in the research were obtained in compliance with ethical standards and with the appropriate permissions. Furthermore, the research follows the principles of integrity, transparency, and honesty in reporting the results. Any conflicts of interest have been disclosed, and proper acknowledgment has been given to all sources and contributors.

Supporting Institution

No external funding or institutional support was received for this research.

References

  • Andujar, A., & Spratt, M. (2023). Using AI to support CLIL teacher language. Journal of Research in Applied Linguistics, 14(2), 7–19. https://doi.org/10.22055/RALS.2023.45267.3177
  • Bengio, Y., Simard, P., & Frasconi, P. (1994). Learning long-term dependencies with gradient descent is difficult. IEEE Transactions on Neural Networks, 5(2), 157–166.
  • Council of the European Union. (2021). Consolidated GDPR text. https://eur-lex.europa.eu
  • Coyle, D., Hood, P., & Marsh, D. (2010). CLIL: Content and language integrated learning. Cambridge University Press.
  • Dalton-Puffer, C. (2007). Discourse in content and language integrated learning (CLIL) classrooms. In D. Marsh & C. J. Ramos (Eds.), CLIL in practice (pp. 153–172). John Benjamins.
  • Devlin, J., Chang, M.-W., Lee, K., & Toutanova, K. (2019). BERT: Pre-training of deep bidirectional transformers for language understanding. Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), 4171–4186.
  • Goodfellow, I., Bengio, Y., & Courville, A. (2016). Deep learning. MIT Press.
  • Graves, A., Mohamed, A.-R., & Hinton, G. (2013). Speech recognition with deep recurrent neural networks. Proceedings of the IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), 6645–6649. https://doi.org/10.1109/ICASSP.2013.6638947
  • Hochreiter, S., & Schmidhuber, J. (1997). Long short-term memory. Neural Computation, 9(8), 1735–1780. https://doi.org/10.1162/neco.1997.9.8.1735
  • Jurafsky, D., & Martin, J. H. (2020). Speech and language processing (3rd ed.). Pearson.
  • Kahraman, A., & Korucu, A. T. (2022). Artificial intelligence applications in education: A content analysis of publications in the Web of Science database. Journal of Educational Technology and Online Learning, 5(2), 312–328. https://doi.org/10.31681/jetol.1100607
  • Karpathy, A. (2015, May 21). The unreasonable effectiveness of recurrent neural networks. Andrej Karpathy blog. http://karpathy.github.io/2015/05/21/rnn-effectiveness/
  • Mageira, K., Pittou, D., Papasalouros, A., Kotis, K., Zangogianni, P., & Daradoumis, A. (2022). Educational AI chatbots for content and language integrated learning. Applied Sciences, 12(7), 3239. https://doi.org/10.3390/app12073239
  • Marsh, D. (2002). CLIL/EMILE: The European dimension: Actions, trends and foresight potential. European Commission.
  • Mehisto, P., & Ting, T. (2017). CLIL essentials for secondary school teachers. Cambridge University Press.
  • Mikolov, T., Chen, K., Corrado, G., & Dean, J. (2013). Efficient estimation of word representations in vector space. arXiv. https://arxiv.org/abs/1301.3781
  • Mikolov, T., Sutskever, I., Chen, K., Corrado, G. S., & Dean, J. (2013). Distributed representations of words and phrases and their compositionality. Advances in Neural Information Processing Systems, 26, 3111–3119.
  • Nazzaro, A., Nazzaro, L., & Santini, C. (2025). LSTM-driven CLIL: Cybersecurity vocabulary learning with AI. Preprints. https://doi.org/10.20944/preprints202504.2124.v2
  • Prechelt, L. (1998). Early stopping—but when? In G. B. Orr & K.-R. Müller (Eds.), Neural networks: Tricks of the trade (pp. 55–69). Springer.
  • Srivastava, N., Hinton, G., Krizhevsky, A., Sutskever, I., & Salakhutdinov, R. (2014). Dropout: A simple way to prevent neural networks from overfitting. Journal of Machine Learning Research, 15, 1929–1958.
  • Swain, M. (2006). Languaging, agency, and collaboration in advanced second language proficiency. In H. Byrnes (Ed.), Advanced language learning: The contribution of Halliday and Vygotsky (pp. 95–108). Continuum.
  • Tonbuloğlu, B. (2023). An evaluation of the use of artificial intelligence applications in online education. Journal of Educational Technology & Online Learning, 6(4), 866–884. https://doi.org/10.31681/jetol.1335906
  • Van Lier, L. (2004). The ecology and semiotics of language learning. Springer.
  • Vygotsky, L. S. (1978). Mind in society: The development of higher psychological processes (M. Cole, V. John-Steiner, S. Scribner, & E. Souberman, Eds.). Harvard University Press. (Original works published 1930–1934)
  • Vygotsky, L. S. (1987). Thinking and speech. In R. W. Rieber & A. S. Carton (Eds.), The collected works of L. S. Vygotsky: Vol. 1. Problems of general psychology (pp. 39–285). Plenum Press.
  • Wadden, P., Andujar, A., Mehisto, P., & Ting, T. (2024). Creating and crafting CLIL content learning materials. JALT 2023 Post-Conference Publication, 1–15.

Details

Primary Language English
Subjects Information Systems (Other), Instructional Design, Instructional Technologies, Lifelong Learning
Journal Section Articles
Authors

Antonio Nazzaro 0009-0001-1463-8940

Catia Santini 0009-0006-9257-1346

Lidia Nazzaro 0000-0002-1858-859X

Publication Date September 30, 2025
Submission Date April 28, 2025
Acceptance Date June 18, 2025
Published in Issue Year 2025 Volume: 8 Issue: 3

Cite

APA Nazzaro, A., Santini, C., & Nazzaro, L. (2025). LSTM-driven CLIL: Cybersecurity vocabulary learning with AI. Journal of Educational Technology and Online Learning, 8(3), 313–329. https://doi.org/10.31681/jetol.1685183