Conference Paper

Fetal Movement Detection and Anatomical Plane Recognition using YOLOv5 Network in Ultrasound Scans

Year 2021, Issue: 26 - Ejosat Special Issue 2021 (HORA), 208-216, 31.07.2021
https://doi.org/10.31590/ejosat.951786

Abstract

Analyzing medical images and videos with computer-aided algorithms provides important benefits in the diagnosis and treatment of diseases. In recent years in particular, advances in deep learning algorithms have brought continuous improvements in the speed, performance and hardware requirements of medical data processing. Examining medical data, which may require advanced expertise, with deep learning algorithms has become widely used as a secondary tool in physicians' decision-making. Tracking the movements of the fetus and recognizing its anatomical planes in ultrasound (US) videos are important parameters in evaluating the health of the baby. In this study, a method based on the YOLOv5 deep learning network is proposed to recognize fetal anatomical planes in fetal ultrasound and to detect fetal movements. First, a dataset is created from videos containing the movements of fetuses at 16-20 weeks of gestation. In the next step, the fetal head, arm, heart and body are identified and tracked on the labeled data using the deep-SORT algorithm. In the experimental studies conducted on ultrasound videos within the scope of the study, the head, body, heart and arm are recognized by the YOLOv5 algorithm with F1-scores of 95.04%, 94.42%, 88.31% and 83.23%, respectively. In addition, the movements of the fetal head, heart and body are tracked in the ultrasound videos, and the trajectories and patterns of these movements are extracted. Thus, fetal movements are detected from the movement patterns transformed into a two-dimensional plane.
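
The abstract outlines a two-stage pipeline: a YOLOv5 detector recognizes the head, body, heart and arm in each frame, and deep-SORT associates those detections across frames so that per-structure trajectories can be extracted. The sketch below only illustrates that general flow, using the public ultralytics/yolov5 hub interface and the reference nwojke/deep_sort implementation listed in the references; the weight file, appearance model and video path are placeholders rather than the authors' artifacts, and the training and tracking settings of the study may differ.

```python
# Minimal sketch (not the authors' code): per-frame YOLOv5 detection of the four
# fetal classes, followed by Deep SORT association to build trajectories.
# File names (detector weights, appearance model, video) are placeholders.
import cv2
import numpy as np
import torch

# nwojke/deep_sort must be on PYTHONPATH; mars-small128.pb is that repo's
# default (pedestrian re-ID) appearance model, used here only for illustration.
from deep_sort import nn_matching
from deep_sort.detection import Detection
from deep_sort.tracker import Tracker
from tools.generate_detections import create_box_encoder

model = torch.hub.load("ultralytics/yolov5", "custom", path="fetal_yolov5.pt")
encoder = create_box_encoder("mars-small128.pb", batch_size=32)
tracker = Tracker(nn_matching.NearestNeighborDistanceMetric("cosine", 0.3, budget=100))

trajectories = {}  # track_id -> [(frame_idx, cx, cy), ...]
cap = cv2.VideoCapture("fetal_scan.mp4")
frame_idx = 0
while True:
    ok, frame = cap.read()
    if not ok:
        break
    # YOLOv5 hub models expect RGB; OpenCV delivers BGR.
    det = model(frame[:, :, ::-1]).xyxy[0].cpu().numpy()  # x1, y1, x2, y2, conf, cls
    tlwh = det[:, :4].copy()
    tlwh[:, 2:4] -= tlwh[:, :2]                           # to top-left, width, height
    features = encoder(frame, tlwh)
    detections = [Detection(box, score, feat)
                  for box, score, feat in zip(tlwh, det[:, 4], features)]
    tracker.predict()
    tracker.update(detections)
    for track in tracker.tracks:
        if not track.is_confirmed() or track.time_since_update > 0:
            continue
        x1, y1, x2, y2 = track.to_tlbr()
        trajectories.setdefault(track.track_id, []).append(
            (frame_idx, (x1 + x2) / 2.0, (y1 + y2) / 2.0))
    frame_idx += 1
cap.release()
```

The accumulated box centers per track are the kind of two-dimensional movement patterns the abstract refers to; how those patterns are further analyzed is described in the paper itself and is outside this sketch.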

Acknowledgements

The authors thank Evliya Çelebi Training and Research Hospital of Kütahya Health Sciences University for providing the fetal US dataset. We would also like to express our sincere gratitude to Professor Huiyu Zhou of the University of Leicester, who shared his experience and contributed to this study.

References

  • Ahmed, M., & Noble, J. A. (2016). Fetal ultrasound image classification using a bag-of-words model trained on sonographers’ eye movements. Procedia Computer Science, 90, 157-162.
  • Bai, Y. (2016). Object tracking & fetal signal monitoring. Southern Illinois University at Carbondale.
  • Baumgartner, C. F., Kamnitsas, K., Matthew, J., Fletcher, T. P., Smith, S., Koch, L. M., Kainz, B., & Rueckert, D. (2017). SonoNet: Real-time detection and localisation of fetal standard scan planes in freehand ultrasound. IEEE Transactions on Medical Imaging, 36(11), 2204-2215.
  • Bewley, A., Ge, Z., Ott, L., Ramos, F., & Upcroft, B. (2016). Simple online and realtime tracking. Paper presented at the 2016 IEEE international conference on image processing (ICIP). pp. 3464-3468.
  • Carneiro, G., Georgescu, B., Good, S., & Comaniciu, D. (2008). Detection and measurement of fetal anatomies from ultrasound images using a constrained probabilistic boosting tree. IEEE Transactions on Medical Imaging, 27(9), 1342-1355.
  • Deep-SORT. (2021). Deep-SORT Algorithm. Available online: https://github.com/nwojke/deep_sort
  • Deepika, P., Suresh, R., & Pabitha, P. (2021). Defending Against Child Death: Deep learning‐based diagnosis method for abnormal identification of fetus ultrasound Images. Computational Intelligence, 37(1), 128-154.
  • Fiorentino, M. C., Moccia, S., Capparuccini, M., Giamberini, S., & Frontoni, E. (2021). A regression framework to head-circumference delineation from US fetal images. Computer Methods and Programs in Biomedicine, 198, 105771.
  • Gao, Y., Maraci, M. A., & Noble, J. A. (2016). Describing ultrasound video content using deep convolutional neural networks. Paper presented at the 2016 IEEE 13th International Symposium on Biomedical Imaging (ISBI). pp. 787-790.
  • Ishikawa, G., Xu, R., Ohya, J., & Iwata, H. (2019). Detecting a Fetus in Ultrasound Images using Grad CAM and Locating the Fetus in the Uterus. Paper presented at the ICPRAM. pp. 181-189.
  • Jocher, G. (2020). YOLOv5. Available online: https://github.com/ultralytics/yolov5
  • Kumar, A., Sridar, P., Quinton, A., Kumar, R. K., Feng, D., Nanan, R., & Kim, J. (2016). Plane identification in fetal ultrasound images using saliency maps and convolutional neural networks. Paper presented at the 2016 IEEE 13th International Symposium on Biomedical Imaging (ISBI). pp. 791-794.
  • Lei, B., Zhuo, L., Chen, S., Li, S., Ni, D., & Wang, T. (2014). Automatic recognition of fetal standard plane in ultrasound image. Paper presented at the 2014 IEEE 11th International Symposium on Biomedical Imaging (ISBI). pp. 85-88.
  • Liu, W., Wang, Z., Zhou, B., Yang, S., & Gong, Z. (2021). Real-time Signal Light Detection based on Yolov5 for Railway. Paper presented at the IOP Conference Series: Earth and Environmental Science. pp. 042069.
  • Looney, P., Stevenson, G. N., Nicolaides, K. H., Plasencia, W., Molloholli, M., Natsis, S., & Collins, S. L. (2017). Automatic 3D ultrasound segmentation of the first trimester placenta using deep learning. Paper presented at the 2017 IEEE 14th International Symposium on Biomedical Imaging (ISBI 2017). pp. 279-282.
  • Malathi, G., & Shanthi, V. (2009). Wavelet based features for ultrasound placenta images classification. Paper presented at the 2009 Second International Conference on Emerging Trends in Engineering & Technology. pp. 341-345.
  • Malta, A., Mendes, M., & Farinha, T. (2021). Augmented Reality Maintenance Assistant Using YOLOv5. Applied Sciences, 11(11), 4758.
  • Ravishankar, H., Prabhu, S. M., Vaidya, V., & Singhal, N. (2016). Hybrid approach for automatic segmentation of fetal abdomen from ultrasound images using deep learning. Paper presented at the 2016 IEEE 13th International Symposium on Biomedical Imaging (ISBI). pp. 779-782.
  • Redmon, J., Divvala, S., Girshick, R., & Farhadi, A. (2016). You only look once: Unified, real-time object detection. Paper presented at the Proceedings of the IEEE conference on computer vision and pattern recognition. pp. 779-788.
  • Salomon, L. J., Alfirevic, Z., Berghella, V., Bilardo, C., Hernandez‐Andrade, E., Johnsen, S., Kalache, K., Leung, K. Y., Malinger, G., & Munoz, H. (2011). Practice guidelines for performance of the routine mid‐trimester fetal ultrasound scan. Ultrasound in Obstetrics & Gynecology, 37(1), 116-126.
  • Sinclair, M., Baumgartner, C. F., Matthew, J., Bai, W., Martinez, J. C., Li, Y., Smith, S., Knight, C. L., Kainz, B., & Hajnal, J. (2018). Human-level performance on automatic head biometrics in fetal ultrasound using fully convolutional neural networks. Paper presented at the 2018 40th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC). pp. 714-717.
  • Sobhaninia, Z., Rafiei, S., Emami, A., Karimi, N., Najarian, K., Samavi, S., & Soroushmehr, S. R. (2019). Fetal ultrasound image segmentation for measuring biometric parameters using multi-task deep learning. Paper presented at the 2019 41st Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC). pp. 6545-6548.
  • Soviany, P., & Ionescu, R. T. (2018). Optimizing the trade-off between single-stage and two-stage deep object detectors using image difficulty prediction. Paper presented at the 2018 20th International Symposium on Symbolic and Numeric Algorithms for Scientific Computing (SYNASC). pp. 209-214.
  • van den Heuvel, T. L., Petros, H., Santini, S., de Korte, C. L., & van Ginneken, B. (2019). Automated fetal head detection and circumference estimation from free-hand ultrasound sweeps using deep learning in resource-limited countries. Ultrasound in Medicine & Biology, 45(3), 773-785.
  • Velazquez, M. D., & Rayburn, W. F. (2002). Antenatal evaluation of the fetus using fetal movement monitoring. Clinical Obstetrics and Gynecology, 45(4), 993-1004.
  • Wang, C.-Y., Liao, H.-Y. M., Wu, Y.-H., Chen, P.-Y., Hsieh, J.-W., & Yeh, I.-H. (2020). CSPNet: A new backbone that can enhance learning capability of CNN. Paper presented at the Proceedings of the IEEE/CVF conference on computer vision and pattern recognition workshops. pp. 390-391.
  • Whitehead, C. L., Cohen, N., Visser, G. H., & Farine, D. (2020). Are increased fetal movements always reassuring? The Journal of Maternal-Fetal & Neonatal Medicine, 33(21), 3713-3718.
  • Wojke, N., Bewley, A., & Paulus, D. (2017). Simple online and realtime tracking with a deep association metric. Paper presented at the 2017 IEEE international conference on image processing (ICIP). pp. 3645-3649.
  • Wróbel, J., Kupka, T., Horoba, K., Matonia, A., Roj, D., & Jeżewski, J. (2014). Automated detection of fetal movements in Doppler ultrasound signals versus maternal perception. Journal of Medical Informatics & Technologies, 23.
  • Yan, B., Fan, P., Lei, X., Liu, Z., & Yang, F. (2021). A Real-Time Apple Targets Detection Method for Picking Robot Based on Improved YOLOv5. Remote Sensing, 13(9), 1619.
  • Yaqub, M., Napolitano, R., Ioannou, C., Papageorghiou, A., & Noble, J. A. (2012). Automatic detection of local fetal brain structures in ultrasound images. Paper presented at the 2012 9th IEEE International Symposium on Biomedical Imaging (ISBI). pp. 1555-1558.
  • You, J., Li, Q., Guo, Z., & Zhao, R. (2017). Smart fetal monitoring. Paper presented at the International Conference on Information Science and Applications. pp. 494-503.
  • Yu, Z., Ni, D., Chen, S., Li, S., Wang, T., & Lei, B. (2016). Fetal facial standard plane recognition via very deep convolutional networks. Paper presented at the 2016 38th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC). pp. 627-630.
There are 33 references in total.

Details

Primary Language: English
Subjects: Engineering
Section: Articles
Authors

Emre Dandıl 0000-0001-6559-1399

Musa Turkan 0000-0002-4370-7474

Furkan Ertürk Urfalı 0000-0002-4875-7761

İsmail Biyik 0000-0001-6111-9302

Mehmet Korkmaz 0000-0001-6234-9484

Publication Date: 31 July 2021
Published in Issue: Year 2021, Issue: 26 - Ejosat Special Issue 2021 (HORA)

Cite

APA: Dandıl, E., Turkan, M., Urfalı, F. E., Biyik, İ., et al. (2021). Fetal Movement Detection and Anatomical Plane Recognition using YOLOv5 Network in Ultrasound Scans. Avrupa Bilim Ve Teknoloji Dergisi(26), 208-216. https://doi.org/10.31590/ejosat.951786