Research Article

Evaluation of Attention Mechanism-Based Deep Learning Architectures for Turkish Text Mining

Year 2022, Issue: 34, 403 - 407, 31.03.2022
https://doi.org/10.31590/ejosat.1082379

Abstract

In recent years, with decreasing computational costs and growing data volumes, significant advances have been made in deep neural network models for building effective learning models on big data. Deep neural network architectures have been successfully applied not only to natural language processing problems such as sentiment analysis, text classification, and machine translation, but also to many other fields, including speech recognition and computer vision. In text classification, architectures such as recurrent neural networks, long short-term memory, and gated recurrent units are used effectively. Recurrent neural networks (RNNs) are inefficient at learning long-term dependencies. In this architecture, tokens are processed sequentially, and after each token a state vector is produced that represents the data seen so far. Owing to the vanishing gradient problem, RNN-based architectures suffer from the state at the end of a long sentence often containing no information about tokens observed early on. The attention mechanism makes it possible for a model to look directly at, and exploit, the state at an earlier point in the sentence. All states thus remain accessible, and more effective text modeling can be achieved by providing sharper information about relevant tokens far from the current token, according to relevance criteria learned with respect to the current token. The attention mechanism has been successfully applied to many natural language processing tasks, most notably machine translation and text summarization. In this study, the performance of attention mechanism-based deep learning architectures for Turkish sentiment analysis is examined comparatively.
In the experimental analysis, a recurrent neural network, long short-term memory (LSTM), gated recurrent unit (GRU), global attention-based LSTM, self-attention-based LSTM, hierarchical attention-based LSTM, global attention-based GRU, self-attention-based GRU, and hierarchical attention-based GRU were evaluated. Attention mechanism-based architectures are observed to significantly increase the correct classification rate for sentiment analysis.
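The global (dot-product) attention idea described in the abstract, scoring every hidden state against the current state and forming a weighted context vector, can be sketched in a few lines. This is an illustrative toy in plain Python, not the authors' implementation; the function and variable names are my own, and real models would compute scores with learned parameters over RNN hidden states.

```python
import math

def softmax(scores):
    # Numerically stable softmax over a list of scores.
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def global_attention(hidden_states, query):
    # Dot-product alignment score between the query (e.g. the final
    # RNN hidden state) and every hidden state in the sequence.
    scores = [sum(h_d * q_d for h_d, q_d in zip(h, query)) for h in hidden_states]
    weights = softmax(scores)
    # Context vector: attention-weighted sum of all hidden states,
    # so even distant tokens contribute in proportion to their relevance.
    dim = len(query)
    context = [sum(w * h[d] for w, h in zip(weights, hidden_states))
               for d in range(dim)]
    return weights, context

# Toy example: three 2-dimensional hidden states, query = last state.
states = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
weights, context = global_attention(states, states[-1])
```

The state most similar to the query receives the largest weight, while the earlier states still contribute to the context vector instead of being forgotten, which is exactly the property the abstract contrasts with a plain RNN.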

Supporting Institution

İzmir Katip Çelebi Üniversitesi

Project Number

2022-GAP-MÜMF-0030

Thanks

This research was supported by the Scientific Research Projects Coordination Unit (BAP) of İzmir Kâtip Çelebi University (Project no: 2022-GAP-MÜMF-0030).

References

  • Deng, L., & Liu, Y. (Eds.). (2018). Deep learning in natural language processing. Springer.
  • Onan, A., & Korukoğlu, S. (2017). A feature selection model based on genetic rank aggregation for text sentiment classification. Journal of Information Science, 43(1), 25-38.
  • Liu, B. (2012). Sentiment analysis and opinion mining. Synthesis lectures on human language technologies, 5(1), 1-167.
  • Tan, S., & Zhang, J. (2008). An empirical study of sentiment analysis for Chinese documents. Expert Systems with Applications, 34(4), 2622-2629.
  • Munkhdalai, T., & Yu, H. (2017, April). Neural tree indexers for text understanding. In Proceedings of the conference. Association for Computational Linguistics. Meeting (Vol. 1, p. 11). NIH Public Access.
  • Yang, Z., Yang, D., Dyer, C., He, X., Smola, A., & Hovy, E. (2016, June). Hierarchical attention networks for document classification. In Proceedings of the 2016 conference of the North American chapter of the association for computational linguistics: human language technologies (pp. 1480-1489).
  • Yin, Y., Song, Y., & Zhang, M. (2017, September). Document-level multi-aspect sentiment classification as machine comprehension. In Proceedings of the 2017 conference on empirical methods in natural language processing (pp. 2044-2054).
  • Lin, Z., Feng, M., Santos, C. N. D., Yu, M., Xiang, B., Zhou, B., & Bengio, Y. (2017). A structured self-attentive sentence embedding. arXiv preprint arXiv:1703.03130.
  • Chen, H., Sun, M., Tu, C., Lin, Y., & Liu, Z. (2016, November). Neural sentiment classification with user and product attention. In Proceedings of the 2016 conference on empirical methods in natural language processing (pp. 1650-1659).
  • Liu, G., & Guo, J. (2019). Bidirectional LSTM with attention mechanism and convolutional layer for text classification. Neurocomputing, 337, 325-338.
  • Fu, X., Yang, J., Li, J., Fang, M., & Wang, H. (2018). Lexicon-enhanced LSTM with attention for general sentiment analysis. IEEE Access, 6, 71884-71891.
  • Mikolov, T., Chen, K., Corrado, G., & Dean, J. (2013). Efficient estimation of word representations in vector space. arXiv preprint arXiv:1301.3781.
  • Bojanowski, P., Grave, E., Joulin, A., & Mikolov, T. (2017). Enriching word vectors with subword information. Transactions of the association for computational linguistics, 5, 135-146.
  • Zhang, L., Wang, S., & Liu, B. (2018). Deep learning for sentiment analysis: A survey. Wiley Interdisciplinary Reviews: Data Mining and Knowledge Discovery, 8(4), e1253.
  • Onan, A. (2021). Sentiment analysis on massive open online course evaluations: A text mining and deep learning approach. Computer Applications in Engineering Education, 29(3), 572-589.
  • Cho, K., Van Merriënboer, B., Gulcehre, C., Bahdanau, D., Bougares, F., Schwenk, H., & Bengio, Y. (2014). Learning phrase representations using RNN encoder-decoder for statistical machine translation. arXiv preprint arXiv:1406.1078.
  • Bahdanau, D., Cho, K., & Bengio, Y. (2014). Neural machine translation by jointly learning to align and translate. arXiv preprint arXiv:1409.0473.
  • Kardakis, S., Perikos, I., Grivokostopoulou, F., & Hatzilygeroudis, I. (2021). Examining attention mechanisms in deep learning models for sentiment analysis. Applied Sciences, 11(9), 3883.
  • Cheng, J., Dong, L., & Lapata, M. (2016). Long short-term memory-networks for machine reading. arXiv preprint arXiv:1601.06733.
  • Onan, A. (2021). Ensemble of classifiers and term weighting schemes for sentiment analysis in Turkish. Scientific Research Communications, 1(1).


Details

Primary Language Turkish
Subjects Engineering
Journal Section Articles
Authors

Aytuğ Onan 0000-0002-9434-5880

Project Number 2022-GAP-MÜMF-0030
Early Pub Date January 30, 2022
Publication Date March 31, 2022
Published in Issue Year 2022 Issue: 34

Cite

APA Onan, A. (2022). Türkçe Metin Madenciliği için Dikkat Mekanizması Tabanlı Derin Öğrenme Mimarilerinin Değerlendirilmesi. Avrupa Bilim Ve Teknoloji Dergisi(34), 403-407. https://doi.org/10.31590/ejosat.1082379