Sunn pest damage is one of the most serious forms of agricultural damage. Authorities and farmers are working together to find a cost-effective way of separating damaged grains from healthy ones, a challenge that emerging technology can address. Over time, the number of researchers tackling this problem with various machine learning algorithms and image processing techniques has increased. This paper presents a transformer-based approach to identifying sunn pest-damaged and healthy grains across different wheat varieties. First, individual wheat grains were separated from each other using YOLOv8. Then, the dataset was enriched with different data augmentation techniques, and a data-efficient vision transformer was used to classify sunn pest-damaged and healthy grains. As a result, a high accuracy score of 98.61% was achieved on the augmented dataset, surpassing the 93.36% obtained on the raw dataset. This paper's contributions to the literature fall into three categories. In contrast to previous research, perfectly shaped, broken, and half wheat grains are all used, so the findings better reflect real-life environments such as factory production lines. Moreover, this study employs a combination of augmentation techniques, meaning that two separate augmentations, one texture-based and one morphological, were applied to the same image. Finally, no study in the available literature uses a vision transformer to classify healthy and sunn pest-damaged wheat grains; this motivates the use of a data-efficient vision transformer, which achieves the high accuracy score of 98.61%.
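The abstract describes a two-stage pipeline: YOLOv8 localizes and crops individual grains, augmentation enriches the training data, and a data-efficient image transformer (DeiT) classifies each crop. The sketch below illustrates that pipeline under stated assumptions; the model checkpoints, the exact augmentation operations, and the class label order are illustrative choices, not the authors' released configuration.

```python
# Minimal sketch of the pipeline from the abstract, assuming the ultralytics
# and timm libraries. The weight files and augmentation parameters are
# placeholders, not the paper's actual settings.
import torch
import timm
from PIL import Image
from torchvision import transforms
from ultralytics import YOLO

# Assumed: a YOLOv8 model fine-tuned to localize wheat grains.
detector = YOLO("yolov8n.pt")

# Combined augmentation in the spirit of the paper: one texture-based and one
# morphological transform applied to the same image (exact operations assumed).
# Used only to enrich the training set, not at inference time.
augment = transforms.Compose([
    transforms.ColorJitter(brightness=0.2, contrast=0.2),   # texture-based
    transforms.RandomAffine(degrees=15, scale=(0.9, 1.1)),  # morphological
])

preprocess = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
    transforms.Normalize(mean=(0.485, 0.456, 0.406),
                         std=(0.229, 0.224, 0.225)),
])

# Data-efficient image transformer (DeiT) with a two-class head.
classifier = timm.create_model("deit_base_patch16_224",
                               pretrained=True, num_classes=2)
classifier.eval()

def classify_grains(image_path: str) -> list[str]:
    """Detect each grain in the image and label it healthy or damaged."""
    image = Image.open(image_path).convert("RGB")
    labels = []
    for result in detector(image):
        for box in result.boxes.xyxy.tolist():  # (x1, y1, x2, y2) per grain
            crop = image.crop(tuple(int(v) for v in box))
            x = preprocess(crop).unsqueeze(0)
            with torch.no_grad():
                pred = classifier(x).argmax(dim=1).item()
            # Label order is an assumption; the paper's mapping may differ.
            labels.append("healthy" if pred == 0 else "sunn pest-damaged")
    return labels
```

In practice the DeiT head would first be fine-tuned on the augmented grain crops; the detector-then-classifier split mirrors the abstract's two steps and lets each stage be retrained independently.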
| Primary Language | English |
| --- | --- |
| Subjects | Computer Vision, Image Processing, Deep Learning, Neural Networks |
| Journal Section | Research Article |
| Authors | |
| Publication Date | December 31, 2024 |
| Submission Date | July 8, 2024 |
| Acceptance Date | October 27, 2024 |
| Published in Issue | Year 2024, Volume 10, Issue 4 |
JARNAS is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License (CC BY-NC).