Research Article

Radar image-based object detection and tracking for autonomous surface vehicles

Year 2025, Issue: Advanced Online Publication, 1 - 13
https://doi.org/10.52998/trjmms.1743670

Abstract

This study presents a modular, lightweight framework for radar image-based object detection and tracking specifically designed for autonomous surface vehicles (ASVs). The detection module employs HSV (Hue, Saturation, Value) color-space segmentation to identify navigational targets, particularly yellow-encoded objects, and the own-ship indicator, typically rendered in white on commercial marine radar screens. To enhance stability under varying sea conditions, the system applies Gaussian smoothing, morphological operations, and area-based filtering during preprocessing. A dynamic multi-target tracking module assigns persistent object identifiers (IDs) using a Euclidean distance-based association scheme enhanced with velocity- and direction-aware ID recycling. This approach mitigates identity fragmentation caused by occlusions or abrupt maneuvers. Furthermore, the framework estimates each target’s velocity relative to the own ship, enabling inference of behaviors such as approaching or retreating. The proposed detection method, evaluated on 7,200 radar frames, achieved a 98.7% true positive rate, improving by 4.4% over the area-based baseline and reducing the false positive rate by 59%. This demonstrates robust target discrimination in noisy and cluttered conditions. The proposed tracking system further reduced ID switches by 75% and decreased the total number of generated identities by 58% compared to a position-only baseline, indicating significant gains in tracking continuity and identity stability. The average processing time of 240 ± 17 milliseconds per frame confirms the framework’s suitability for real-time embedded deployment in dynamic maritime environments.
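The HSV segmentation step described in the abstract can be sketched as a per-pixel classification; the hue, saturation, and value thresholds below are illustrative assumptions for a typical marine radar palette (yellow targets, near-white own-ship marker), not the paper's calibrated values, and the stdlib `colorsys` module stands in for an image-processing library:

```python
import colorsys

def classify_pixel(r, g, b):
    """Classify an 8-bit RGB radar-screen pixel via HSV thresholds.

    Thresholds are hypothetical: a real pipeline would calibrate them
    to the specific radar display and apply Gaussian smoothing,
    morphological cleanup, and area filtering to the resulting masks.
    """
    # colorsys expects channels in [0, 1]; hue is also returned in [0, 1]
    h, s, v = colorsys.rgb_to_hsv(r / 255.0, g / 255.0, b / 255.0)
    if s < 0.2 and v > 0.8:
        return "own_ship"      # near-white: low saturation, high value
    if 0.10 < h < 0.20 and s > 0.5 and v > 0.5:
        return "target"        # yellow band: hue around 60 degrees (~1/6)
    return "background"        # sea clutter, range rings, etc.
```

In a full implementation the same thresholds would be applied image-wide (e.g. with `cv2.inRange` on an HSV-converted frame), after which connected components below an area threshold would be discarded as clutter.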
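The Euclidean distance-based association with velocity-aware ID recycling can likewise be sketched. The gating distances and lost-track age limit below are hypothetical placeholders, not the paper's parameters; the point is the mechanism: a target that briefly disappears is matched against the predicted positions of recently lost tracks before a new ID is minted, which is what suppresses ID switches and excess identity creation.

```python
import math

class Tracker:
    """Minimal sketch: greedy nearest-neighbour association with a
    constant-velocity prediction and recycling of recently lost IDs."""

    def __init__(self, gate=50.0, recycle_gate=80.0, max_lost_age=5):
        self.gate = gate                  # max match distance, active tracks
        self.recycle_gate = recycle_gate  # max match distance, lost tracks
        self.max_lost_age = max_lost_age  # frames before a lost ID expires
        self.tracks = {}                  # id -> (x, y, vx, vy)
        self.lost = {}                    # id -> (x, y, vx, vy, age)
        self.next_id = 0

    def update(self, detections):
        assigned = {}
        unmatched = list(detections)
        # 1) match each active track to its nearest gated detection,
        #    predicting one step ahead with the stored velocity
        for tid, (x, y, vx, vy) in list(self.tracks.items()):
            px, py = x + vx, y + vy
            best, best_d = None, self.gate
            for d in unmatched:
                dist = math.hypot(d[0] - px, d[1] - py)
                if dist < best_d:
                    best, best_d = d, dist
            if best is not None:
                unmatched.remove(best)
                assigned[tid] = (best[0], best[1], best[0] - x, best[1] - y)
            else:
                self.lost[tid] = (x, y, vx, vy, 0)  # track lost this frame
        # 2) before minting a new ID, try to recycle a lost one whose
        #    velocity-extrapolated position is close to the detection
        for d in unmatched:
            best, best_d = None, self.recycle_gate
            for tid, (x, y, vx, vy, age) in self.lost.items():
                px, py = x + vx * (age + 1), y + vy * (age + 1)
                dist = math.hypot(d[0] - px, d[1] - py)
                if dist < best_d:
                    best, best_d = tid, dist
            if best is not None:
                x, y, vx, vy, _ = self.lost.pop(best)
                assigned[best] = (d[0], d[1], d[0] - x, d[1] - y)
            else:
                assigned[self.next_id] = (d[0], d[1], 0.0, 0.0)
                self.next_id += 1
        # 3) age out stale lost tracks so their IDs are never reused late
        self.lost = {t: (x, y, vx, vy, a + 1)
                     for t, (x, y, vx, vy, a) in self.lost.items()
                     if a + 1 <= self.max_lost_age}
        self.tracks = assigned
        return assigned
```

With a position-only baseline, a target occluded for one frame would reappear under a fresh ID; here the extrapolated lost-track position reclaims the original ID, which is the behavior the reported 75% reduction in ID switches quantifies.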

Ethical Statement

No ethics committee permission is required for this study.

Supporting Institution

No funding was received from institutions or agencies for the execution of this research.

Thanks

The data used in this study were kindly provided by Fatsa Faculty of Marine Sciences, Ordu University.

References

  • Barekar, P.V., Singh, K.R. (2024). Analysis of different noise filtering techniques for object detection and tracking from video with varying illumination. Journal of Statistics and Management Systems, 27(2): 303-313. doi: 10.47974/JSMS-1256.
  • Cheng, Y., Xu, H., Liu, Y. (2021). Robust small object detection on the water surface through fusion of camera and millimeter wave radar. IEEE/CVF International Conference on Computer Vision (ICCV), 2021, pp. 15263-15272, Montreal.
  • Jiang, Y., Dong, L., Liang, J. (2022). Image enhancement of maritime infrared targets based on scene discrimination. Sensors, 22(15): 5873. doi: 10.3390/s22155873.
  • Kim, H., Kim, D., Lee, S.M. (2023). Marine object segmentation and tracking by learning marine radar images for autonomous surface vehicles. IEEE Sensors Journal, 23(9): 10062-10070. doi: 10.1109/JSEN.2023.3259471.
  • Kim, K., Kim, J., Kim, J. (2021). Robust data association for multi-object detection in maritime environments using camera and radar measurements. IEEE Robotics and Automation Letters, 6(3): 5865-5872. doi: 10.1109/LRA.2021.3084891.
  • Kristan, M., Kenk, V.S., Kovačič, S., Perš, J. (2015). Fast image-based obstacle detection from unmanned surface vehicles. IEEE Transactions on Cybernetics, 46(3): 641-654. doi: 10.1109/TCYB.2015.2412251.
  • Lee, M.F.R., Lin, C.Y. (2022). Object tracking for an autonomous unmanned surface vehicle. Machines, 10(5): 378. doi: 10.3390/machines10050378.
  • Liu, Y.Q., Du, X., Shen, H.L., Chen, S.J. (2020). Estimating generalized Gaussian blur kernels for out-of-focus image deblurring. IEEE Transactions on Circuits and Systems for Video Technology, 31(3): 829-843. doi: 10.1109/TCSVT.2020.2990623.
  • Liu, Z., Cai, Y., Wang, H., Chen, L. (2021). Surrounding objects detection and tracking for autonomous driving using LiDAR and radar fusion. Chinese Journal of Mechanical Engineering, 34: 1-12. doi: 10.1186/s10033-021-00630-y.
  • Omrani, E., Mousazadeh, H., Omid, M., Masouleh, M.T., Jafarbiglu, H., Salmani-Zakaria, Y., Makhsoos, A., Monhaseri, F., Kiapei, A. (2020). Dynamic and static object detection and tracking in an autonomous surface vehicle. Ships and Offshore Structures, 15(7): 711-721. doi: 10.1080/17445302.2019.1668642.
  • Rahman, A.N., Heriana, O., Putranto, P., Darwis, F., Pristianto, E.J., Wijayanto, Y.N. (2017). Morphological dilation for radar image enhancement. 2017 International Conference on Radar, Antenna, Microwave, Electronics, and Telecommunications (ICRAMET), 2017, pp. 68-71, Jakarta.
  • Stateczny, A., Kazimierski, W., Burdziakowski, P., Motyl, W., Wisniewska, M. (2019). Shore construction detection by automotive radar for the needs of autonomous surface vehicle navigation. ISPRS International Journal of Geo-Information, 8(2): 80. doi: 10.3390/ijgi8020080.
  • Stroescu, A., Daniel, L., Gashinova, M. (2020). Combined object detection and tracking on high resolution radar imagery for autonomous driving using deep neural networks and particle filters. 2020 IEEE Radar Conference (RadarConf20), 2020, pp. 1-6, Florence.
  • Taipalmaa, J., Passalis, N., Raitoharju, J. (2020). Different color spaces in deep learning-based water segmentation for autonomous marine operations. 2020 IEEE International Conference on Image Processing (ICIP), 2020, pp. 3169-3173, Abu Dhabi.
  • Tang, X., Zhang, Z., Qin, Y. (2021). On-road object detection and tracking based on radar and vision fusion: A review. IEEE Intelligent Transportation Systems Magazine, 14(5): 103-128. doi: 10.1109/MITS.2021.3093379.
  • Uğurlu, H. (2023). Application of combined SWOT and AHP (A’WOT): A case study for maritime autonomous surface ships. Turkish Journal of Maritime and Marine Sciences, 9(2): 129-147. doi: 10.52998/trjmms.1365603.
  • Uğurlu, H. (2024). Quantification of the head-on situation under Rule 14 of COLREGs with modeling of ships. Ocean & Coastal Management, 255, 107261. doi: 10.1016/j.ocecoaman.2024.107261.
  • Uğurlu, H., Djecevic, O., Çiçek, I. (2024). A Conceptual COLREGs-based Obstacle Avoidance Algorithm Implementing Dynamic Path Planning and Collision Risk Assessment. Journal of ETA Maritime Science, 12(4): 377-394. doi: 10.4274/jems.2024.35682.
  • Xu, X., Wu, B., Teixeira, Â.P., Yan, X., Soares, C.G. (2023). Integration of radar sequential images and AIS for ship speed and heading estimation under uncertainty. IEEE Transactions on Intelligent Transportation Systems, 25(6): 5688-5702. doi: 10.1109/TITS.2023.3338293.
  • Yu, Q., Wang, B., Su, Y. (2021). Object detection-tracking algorithm for unmanned surface vehicles based on a radar-photoelectric system. IEEE Access, 9: 57529-57541. doi: 10.1109/ACCESS.2021.3072897.
There are 20 citations in total.

Details

Primary Language English
Subjects Marine Electronics, Control and Automation
Journal Section Research Article
Authors

Sercan Külcü 0000-0002-4871-709X

Submission Date July 16, 2025
Acceptance Date September 1, 2025
Early Pub Date October 1, 2025
Published in Issue Year 2025 Issue: Advanced Online Publication

Cite

APA Külcü, S. (2025). Radar image-based object detection and tracking for autonomous surface vehicles. Turkish Journal of Maritime and Marine Sciences (Advanced Online Publication), 1-13. https://doi.org/10.52998/trjmms.1743670

Creative Commons License

This journal is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License (CC BY-NC-ND 4.0).