Research Article

Improving Plant Disease Recognition Through Gradient-Based Few-shot Learning with Attention Mechanisms

Year 2023, Volume: 13, Issue: 3, 1482–1495, 01.09.2023
https://doi.org/10.21597/jist.1283491

Abstract

This study investigates the use of few-shot learning algorithms to improve classification performance in settings where conventional deep learning methods fail due to a lack of training data. Specifically, we propose a few-shot learning approach that combines the Almost No Inner Loop (ANIL) algorithm with attention modules to classify tomato diseases in the Plant Village dataset. The attended features produced by five separate attention modules are each classified with a Multi-Layer Perceptron (MLP), and soft voting is used to combine the classification scores of the individual classifiers. The results show that the proposed approach achieves state-of-the-art accuracy rates of 97.05% and 97.66% for 10-shot and 20-shot classification, respectively. These findings demonstrate the potential of incorporating attention mechanisms into the feature extraction process and suggest new avenues for research on few-shot learning methods.
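The sketch below (PyTorch) illustrates the two mechanisms named in the abstract: ANIL-style adaptation, in which only the classifier head is updated on the support set while the attention-augmented backbone stays frozen, and soft voting over the five branch classifiers. It is a minimal illustration, not the paper's implementation: the feature dimension, inner-loop learning rate, step count, and the use of precomputed features are assumptions, and the outer meta-training loop (handled, e.g., by the learn2learn library cited below) is omitted.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class MLPHead(nn.Module):
    """Small MLP classifier applied to one attention branch's features."""
    def __init__(self, in_dim: int, n_way: int, hidden: int = 256):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(in_dim, hidden),
            nn.ReLU(),
            nn.Linear(hidden, n_way),
        )

    def forward(self, x):
        return self.net(x)


def anil_adapt(head, support_feats, support_labels, inner_lr=0.01, steps=5):
    """ANIL-style inner loop: only the head is updated on the support set;
    the backbone and attention module stay frozen, so the attended support
    features can be precomputed once per episode."""
    opt = torch.optim.SGD(head.parameters(), lr=inner_lr)
    for _ in range(steps):
        loss = F.cross_entropy(head(support_feats), support_labels)
        opt.zero_grad()
        loss.backward()
        opt.step()
    return head


def soft_vote(branch_logits):
    """Soft voting: average the per-branch class probabilities."""
    probs = [F.softmax(logits, dim=-1) for logits in branch_logits]
    return torch.stack(probs, dim=0).mean(dim=0)


# Usage sketch for one 5-way, 10-shot episode with five attention branches.
# support_feats[i] / query_feats[i] stand in for the attended features
# produced by the i-th attention module (shape: [n_samples, feat_dim]).
feat_dim, n_way, n_shot = 512, 5, 10
support_feats = [torch.randn(n_way * n_shot, feat_dim) for _ in range(5)]
support_labels = torch.arange(n_way).repeat_interleave(n_shot)
query_feats = [torch.randn(25, feat_dim) for _ in range(5)]

heads = [anil_adapt(MLPHead(feat_dim, n_way), f, support_labels)
         for f in support_feats]
fused = soft_vote([h(q) for h, q in zip(heads, query_feats)])
predictions = fused.argmax(dim=-1)  # one label per query image
```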

References

  • Albattah, W., Nawaz, M., Javed, A., Masood, M., & Albahli, S. (2022). A novel deep learning method for detection and classification of plant diseases. Complex & Intelligent Systems, 1–18.
  • Argüeso, D., Picon, A., Irusta, U., Medela, A., San-Emeterio, M. G., Bereciartua, A., & Alvarez-Gila, A. (2020). Few-Shot Learning approach for plant disease classification using images taken in the field. Computers and Electronics in Agriculture, 175, 105542.
  • Arnold, S. M. R., Mahajan, P., Datta, D., Bunner, I., & Zarkias, K. S. (2020). learn2learn: A Library for Meta-Learning Research. http://arxiv.org/abs/2008.12284
  • Bayat, S., & Işık, G. (2022). Recognition of Aras Bird Species From Their Voices With Deep Learning Methods. Journal of the Institute of Science and Technology, 12(3), 1250–1263.
  • Cao, Y., Xu, J., Lin, S., Wei, F., & Hu, H. (2019). GCNet: Non-local networks meet squeeze-excitation networks and beyond. Proceedings of the IEEE/CVF International Conference on Computer Vision Workshops.
  • Chen, L., Cui, X., & Li, W. (2021). Meta-learning for few-shot plant disease detection. Foods, 10(10), 2441.
  • Deng, J., Dong, W., Socher, R., Li, L.-J., Li, K., & Fei-Fei, L. (2009). ImageNet: A large-scale hierarchical image database. 2009 IEEE Conference on Computer Vision and Pattern Recognition, 248–255.
  • Dumoulin, V., Houlsby, N., Evci, U., Zhai, X., Goroshin, R., Gelly, S., & Larochelle, H. (2021). Comparing transfer and meta learning approaches on a unified few-shot classification benchmark. ArXiv Preprint ArXiv:2104.02638.
  • Finn, C., Abbeel, P., & Levine, S. (2017). Model-agnostic meta-learning for fast adaptation of deep networks. International Conference on Machine Learning, 1126–1135.
  • Gao, Z., Xie, J., Wang, Q., & Li, P. (2019). Global second-order pooling convolutional networks. Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, 3024–3033.
  • Gündüz, M. Ş., & Işık, G. (2023). A new YOLO-based method for social distancing from real-time videos. Neural Computing and Applications, 1–11.
  • Guo, M.-H., Xu, T.-X., Liu, J.-J., Liu, Z.-N., Jiang, P.-T., Mu, T.-J., Zhang, S.-H., Martin, R. R., Cheng, M.-M., & Hu, S.-M. (2022). Attention mechanisms in computer vision: A survey. Computational Visual Media, 8(3), 331–368.
  • He, K., Zhang, X., Ren, S., & Sun, J. (2016). Deep residual learning for image recognition. Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, 770–778.
  • Hu, J., Shen, L., & Sun, G. (2018). Squeeze-and-excitation networks. Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, 7132–7141.
  • Karaman, A., Pacal, I., Basturk, A., Akay, B., Nalbantoglu, U., Coskun, S., Sahin, O., & Karaboga, D. (2023). Robust real-time polyp detection system design based on YOLO algorithms by optimizing activation functions and hyper-parameters with artificial bee colony (ABC). Expert Systems with Applications, 221, 119741.
  • Kaya, A., Keceli, A. S., Catal, C., Yalic, H. Y., Temucin, H., & Tekinerdogan, B. (2019). Analysis of transfer learning for deep neural network based plant classification models. Computers and Electronics in Agriculture, 158, 20–29.
  • Keceli, A. S., Kaya, A., Catal, C., & Tekinerdogan, B. (2022). Deep learning-based multi-task prediction system for plant disease and species detection. Ecological Informatics, 69, 101679.
  • Li, Y., & Chao, X. (2021). Semi-supervised few-shot learning approach for plant diseases recognition. Plant Methods, 17, 1–10.
  • Lin, H., Tse, R., Tang, S.-K., Qiang, Z., & Pau, G. (2022a). Few-shot learning approach with multi-scale feature fusion and attention for plant disease recognition. Frontiers in Plant Science, 13.
  • Lin, H., Tse, R., Tang, S.-K., Qiang, Z., & Pau, G. (2022b). Few-Shot Learning for Plant-Disease Recognition in the Frequency Domain. Plants, 11(21), 2814.
  • Liu, X., Zhang, F., Hou, Z., Mian, L., Wang, Z., Zhang, J., & Tang, J. (2021). Self-supervised learning: Generative or contrastive. IEEE Transactions on Knowledge and Data Engineering, 35(1), 857–876.
  • Mohanty, S. P., Hughes, D. P., & Salathé, M. (2016). Using deep learning for image-based plant disease detection. Frontiers in Plant Science, 7, 1419.
  • Munkhdalai, T., & Yu, H. (2017). Meta networks. International Conference on Machine Learning, 2554–2563.
  • Nichol, A., Achiam, J., & Schulman, J. (2018). On first-order meta-learning algorithms. ArXiv Preprint ArXiv:1803.02999.
  • Pacal, I. (2022). Deep Learning Approaches for Classification of Breast Cancer in Ultrasound (US) Images. Journal of the Institute of Science and Technology, 12(4), 1917–1927.
  • Patricio, D. I., & Rieder, R. (2018). Computer vision and artificial intelligence in precision agriculture for grain crops: A systematic review. Computers and Electronics in Agriculture, 153, 69–81.
  • Raghu, A., Raghu, M., Bengio, S., & Vinyals, O. (2019). Rapid learning or feature reuse? Towards understanding the effectiveness of MAML. ArXiv Preprint ArXiv:1909.09157.
  • Shen, T., Zhou, T., Long, G., Jiang, J., Wang, S., & Zhang, C. (2018). Reinforced self-attention network: a hybrid of hard and soft attention for sequence modeling. ArXiv Preprint ArXiv:1801.10296.
  • Snell, J., Swersky, K., & Zemel, R. (2017). Prototypical networks for few-shot learning. Advances in Neural Information Processing Systems, 30.
  • Sun, Q., Liu, Y., Chua, T.-S., & Schiele, B. (2019). Meta-transfer learning for few-shot learning. Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, 403–412.
  • Szegedy, C., Vanhoucke, V., Ioffe, S., Shlens, J., & Wojna, Z. (2016). Rethinking the inception architecture for computer vision. Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, 2818–2826.
  • Vinyals, O., Blundell, C., Lillicrap, T., Wierstra, D., & others. (2016). Matching networks for one shot learning. Advances in Neural Information Processing Systems, 29.
  • Wang, Q., Wu, B., Zhu, P., Li, P., Zuo, W., & Hu, Q. (2020). ECA-Net: Efficient channel attention for deep convolutional neural networks. Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, 11534–11542.
  • Wang, S., Li, C., Wang, R., Liu, Z., Wang, M., Tan, H., Wu, Y., Liu, X., Sun, H., Yang, R., & others. (2021). Annotation-efficient deep learning for automatic medical image segmentation. Nature Communications, 12(1), 5915.
  • Wang, W., Song, H., Zhao, S., Shen, J., Zhao, S., Hoi, S. C. H., & Ling, H. (2019). Learning unsupervised video object segmentation through visual attention. Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, 3064–3074.
  • Woo, S., Park, J., Lee, J.-Y., & Kweon, I. S. (2018). CBAM: Convolutional block attention module. Proceedings of the European Conference on Computer Vision (ECCV), 3–19.
  • Yang, J., Guo, X., Li, Y., Marinello, F., Ercisli, S., & Zhang, Z. (2022). A survey of few-shot learning in smart agriculture: developments, applications, and challenges. Plant Methods, 18(1), 1–12.
There are 37 references in total.

Details

Primary Language: English
Subjects: Computer Software
Section: Computer Engineering
Authors

Gültekin Işık 0000-0003-3037-5586

Early View Date: August 29, 2023
Publication Date: September 1, 2023
Submission Date: April 14, 2023
Acceptance Date: May 5, 2023
Published in Issue: Year 2023, Volume: 13, Issue: 3

How to Cite

APA Işık, G. (2023). Improving Plant Disease Recognition Through Gradient-Based Few-shot Learning with Attention Mechanisms. Journal of the Institute of Science and Technology, 13(3), 1482-1495. https://doi.org/10.21597/jist.1283491
AMA Işık G. Improving Plant Disease Recognition Through Gradient-Based Few-shot Learning with Attention Mechanisms. Iğdır Üniv. Fen Bil Enst. Der. September 2023;13(3):1482-1495. doi:10.21597/jist.1283491
Chicago Işık, Gültekin. “Improving Plant Disease Recognition Through Gradient-Based Few-Shot Learning With Attention Mechanisms”. Journal of the Institute of Science and Technology 13, no. 3 (September 2023): 1482-95. https://doi.org/10.21597/jist.1283491.
EndNote Işık G (September 1, 2023) Improving Plant Disease Recognition Through Gradient-Based Few-shot Learning with Attention Mechanisms. Journal of the Institute of Science and Technology 13 3 1482–1495.
IEEE G. Işık, “Improving Plant Disease Recognition Through Gradient-Based Few-shot Learning with Attention Mechanisms”, Iğdır Üniv. Fen Bil Enst. Der., vol. 13, no. 3, pp. 1482–1495, 2023, doi: 10.21597/jist.1283491.
ISNAD Işık, Gültekin. “Improving Plant Disease Recognition Through Gradient-Based Few-Shot Learning With Attention Mechanisms”. Journal of the Institute of Science and Technology 13/3 (September 2023), 1482-1495. https://doi.org/10.21597/jist.1283491.
JAMA Işık G. Improving Plant Disease Recognition Through Gradient-Based Few-shot Learning with Attention Mechanisms. Iğdır Üniv. Fen Bil Enst. Der. 2023;13:1482–1495.
MLA Işık, Gültekin. “Improving Plant Disease Recognition Through Gradient-Based Few-Shot Learning With Attention Mechanisms”. Journal of the Institute of Science and Technology, vol. 13, no. 3, 2023, pp. 1482-95, doi:10.21597/jist.1283491.
Vancouver Işık G. Improving Plant Disease Recognition Through Gradient-Based Few-shot Learning with Attention Mechanisms. Iğdır Üniv. Fen Bil Enst. Der. 2023;13(3):1482-95.