Research Article

Radar image-based object detection and tracking for autonomous surface vehicles

Year 2026, Volume: 12 Issue: 1 , 35 - 47 , 01.03.2026
https://doi.org/10.52998/trjmms.1743670
https://izlik.org/JA92FC63LR

Abstract

This study presents a modular and lightweight framework for radar image-based object detection and tracking, specifically designed for autonomous surface vehicles (ASVs). The detection module uses HSV (Hue, Saturation, Value) color-space segmentation to identify navigational targets, particularly yellow-encoded objects, and the own-ship indicator, typically displayed in white on commercial marine radar screens. To enhance stability under varying sea conditions, the system applies Gaussian smoothing, morphological operations, and area-based filtering during preprocessing. The dynamic multi-target tracking module assigns persistent object identifiers (IDs) using a Euclidean distance-based association scheme enhanced with velocity- and direction-aware ID recovery. This approach reduces ID fragmentation caused by occlusions or sudden maneuvers. Furthermore, the framework estimates the relative speed of each target, enabling inference of behaviors such as approaching or moving away from the own ship. The proposed detection method, evaluated on 7,200 radar frames, achieved a true positive rate of 98.7%, a 4.4% improvement over the area-based baseline, and a 59% reduction in the false positive rate, demonstrating robust target discrimination in noisy and complex conditions. The proposed tracking system achieved significant gains in tracking continuity and identification stability, reducing ID switches by 75% and decreasing the total number of generated IDs by 58% compared to a position-only baseline. The average processing time of 240 ± 17 milliseconds per frame validates the framework’s suitability for real-time embedded deployment in dynamic maritime environments.
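
The detection pipeline summarized above — HSV color-space segmentation of yellow-encoded targets followed by area-based filtering — can be sketched in plain Python. This is an illustrative reconstruction, not the paper's implementation: the hue, saturation, and value thresholds are hypothetical placeholders, the standard-library `colorsys` module stands in for an OpenCV color conversion, and the Gaussian-smoothing and morphological steps of the full preprocessing chain are omitted for brevity.

```python
import colorsys
import numpy as np

# Hypothetical thresholds for "yellow" echoes. colorsys expresses hue as a
# fraction of a full turn, so pure yellow sits near 1/6 (~0.167).
YELLOW_HUE = (0.10, 0.22)
MIN_SAT, MIN_VAL = 0.4, 0.4

def hsv_mask(rgb):
    """Binary mask of pixels whose HSV values fall in the yellow band."""
    h, w, _ = rgb.shape
    mask = np.zeros((h, w), dtype=bool)
    for y in range(h):
        for x in range(w):
            hu, sa, va = colorsys.rgb_to_hsv(*(rgb[y, x] / 255.0))
            mask[y, x] = (YELLOW_HUE[0] <= hu <= YELLOW_HUE[1]
                          and sa >= MIN_SAT and va >= MIN_VAL)
    return mask

def area_filter(mask, min_area):
    """Keep only 4-connected components with at least min_area pixels —
    the area-based filtering step that rejects isolated speckle noise."""
    out = np.zeros_like(mask)
    seen = np.zeros_like(mask)
    h, w = mask.shape
    for sy in range(h):
        for sx in range(w):
            if mask[sy, sx] and not seen[sy, sx]:
                # flood-fill one connected component
                stack, comp = [(sy, sx)], []
                seen[sy, sx] = True
                while stack:
                    y, x = stack.pop()
                    comp.append((y, x))
                    for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)):
                        if 0 <= ny < h and 0 <= nx < w and mask[ny, nx] and not seen[ny, nx]:
                            seen[ny, nx] = True
                            stack.append((ny, nx))
                if len(comp) >= min_area:
                    for y, x in comp:
                        out[y, x] = True
    return out
```

A production pipeline would typically replace the per-pixel loop with vectorized OpenCV calls and apply Gaussian blurring and a morphological opening before the area filter.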
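
The tracking module's Euclidean association with velocity- and direction-aware ID recovery can likewise be sketched. This is a minimal illustration under assumed behavior, not the paper's algorithm: each track extrapolates a constant-velocity prediction while unmatched, so a target that reappears after an occlusion along its previous heading reclaims its old ID instead of spawning a new one. The class and parameter names (`NNTracker`, `max_dist`, `max_missed`) are invented for this sketch.

```python
import math

def _dist(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

class _Track:
    def __init__(self, tid, pos):
        self.tid, self.pos, self.vel, self.missed = tid, pos, (0.0, 0.0), 0

    def predict(self):
        # Extrapolate along the last velocity, one step per elapsed frame,
        # so an occluded track's expected position keeps moving.
        s = self.missed + 1
        return (self.pos[0] + self.vel[0] * s, self.pos[1] + self.vel[1] * s)

class NNTracker:
    def __init__(self, max_dist=30.0, max_missed=5):
        self.tracks, self.next_id = {}, 0
        self.max_dist, self.max_missed = max_dist, max_missed

    def update(self, detections):
        """Associate detections (x, y) to tracks; return {id: position}."""
        free, out = set(self.tracks), {}
        for det in detections:
            # Nearest track by Euclidean distance to its *predicted* position:
            # matching against the extrapolation is what lets a briefly lost
            # track reclaim its ID when the target reappears on its heading.
            best = min(free, key=lambda t: _dist(det, self.tracks[t].predict()),
                       default=None)
            if best is not None and _dist(det, self.tracks[best].predict()) <= self.max_dist:
                free.discard(best)
                tr = self.tracks[best]
                s = tr.missed + 1
                tr.vel = ((det[0] - tr.pos[0]) / s, (det[1] - tr.pos[1]) / s)
                tr.pos, tr.missed = det, 0
                out[tr.tid] = det
            else:
                self.tracks[self.next_id] = _Track(self.next_id, det)
                out[self.next_id] = det
                self.next_id += 1
        for tid in free:  # unmatched tracks: age, drop when stale
            tr = self.tracks[tid]
            tr.missed += 1
            if tr.missed > self.max_missed:
                del self.tracks[tid]
        return out
```

Relative-speed inference as described in the abstract could then be read off each track: the magnitude of `vel` scaled by the frame rate gives speed over ground on screen, and the sign of its component along the bearing to the own-ship position distinguishes approaching from receding targets.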

Ethical Statement

No ethics committee permission is required for this study.

Supporting Institution

No funding was received from institutions or agencies for the execution of this research.

Thanks

The data used in this study were kindly provided by Fatsa Faculty of Marine Sciences, Ordu University.

References

  • Barekar, P.V., Singh, K.R. (2024). Analysis of different noise filtering techniques for object detection and tracking from video with varying illumination. Journal of Statistics and Management Systems, 27(2): 303-313. doi: 10.47974/JSMS-1256.
  • Cheng, Y., Xu, H., Liu, Y. (2021). Robust small object detection on the water surface through fusion of camera and millimeter wave radar. IEEE/CVF International Conference on Computer Vision (ICCV), pp. 15263-15272, Montreal.
  • Jiang, Y., Dong, L., Liang, J. (2022). Image enhancement of maritime infrared targets based on scene discrimination. Sensors, 22(15): 5873. doi: 10.3390/s22155873.
  • Kim, H., Kim, D., Lee, S.M. (2023). Marine object segmentation and tracking by learning marine radar images for autonomous surface vehicles. IEEE Sensors Journal, 23(9): 10062-10070. doi: 10.1109/JSEN.2023.3259471.
  • Kim, K., Kim, J., Kim, J. (2021). Robust data association for multi-object detection in maritime environments using camera and radar measurements. IEEE Robotics and Automation Letters, 6(3): 5865-5872. doi: 10.1109/LRA.2021.3084891.
  • Kristan, M., Kenk, V.S., Kovačič, S., Perš, J. (2015). Fast image-based obstacle detection from unmanned surface vehicles. IEEE Transactions on Cybernetics, 46(3): 641-654. doi: 10.1109/TCYB.2015.2412251.
  • Lee, M.F.R., Lin, C.Y. (2022). Object tracking for an autonomous unmanned surface vehicle. Machines, 10(5): 378. doi: 10.3390/machines10050378.
  • Liu, Y.Q., Du, X., Shen, H.L., Chen, S.J. (2020). Estimating generalized Gaussian blur kernels for out-of-focus image deblurring. IEEE Transactions on Circuits and Systems for Video Technology, 31(3): 829-843. doi: 10.1109/TCSVT.2020.2990623.
  • Liu, Z., Cai, Y., Wang, H., Chen, L. (2021). Surrounding objects detection and tracking for autonomous driving using LiDAR and radar fusion. Chinese Journal of Mechanical Engineering, 34: 1-12. doi: 10.1186/s10033-021-00630-y.
  • Omrani, E., Mousazadeh, H., Omid, M., Masouleh, M.T., Jafarbiglu, H., Salmani-Zakaria, Y., Makhsoos, A., Monhaseri, F., Kiapei, A. (2020). Dynamic and static object detection and tracking in an autonomous surface vehicle. Ships and Offshore Structures, 15(7): 711-721. doi: 10.1080/17445302.2019.1668642.
  • Rahman, A.N., Heriana, O., Putranto, P., Darwis, F., Pristianto, E.J., Wijayanto, Y.N. (2017). Morphological dilation for radar image enhancement. 2017 International Conference on Radar, Antenna, Microwave, Electronics, and Telecommunications (ICRAMET), pp. 68-71, Jakarta.
  • Stateczny, A., Kazimierski, W., Burdziakowski, P., Motyl, W., Wisniewska, M. (2019). Shore construction detection by automotive radar for the needs of autonomous surface vehicle navigation. ISPRS International Journal of Geo-Information, 8(2): 80. doi: 10.3390/ijgi8020080.
  • Stroescu, A., Daniel, L., Gashinova, M. (2020). Combined object detection and tracking on high resolution radar imagery for autonomous driving using deep neural networks and particle filters. 2020 IEEE Radar Conference (RadarConf20), pp. 1-6, Florence.
  • Taipalmaa, J., Passalis, N., Raitoharju, J. (2020). Different color spaces in deep learning-based water segmentation for autonomous marine operations. 2020 IEEE International Conference on Image Processing (ICIP), pp. 3169-3173, Abu Dhabi.
  • Tang, X., Zhang, Z., Qin, Y. (2021). On-road object detection and tracking based on radar and vision fusion: A review. IEEE Intelligent Transportation Systems Magazine, 14(5): 103-128. doi: 10.1109/MITS.2021.3093379.
  • Uğurlu, H. (2023). Application of combined SWOT and AHP (A’WOT): A case study for maritime autonomous surface ships. Turkish Journal of Maritime and Marine Sciences, 9(2): 129-147. doi: 10.52998/trjmms.1365603.
  • Uğurlu, H. (2024). Quantification of the head-on situation under Rule 14 of COLREGs with modeling of ships. Ocean & Coastal Management, 255: 107261. doi: 10.1016/j.ocecoaman.2024.107261.
  • Uğurlu, H., Djecevic, O., Çiçek, I. (2024). A conceptual COLREGs-based obstacle avoidance algorithm implementing dynamic path planning and collision risk assessment. Journal of ETA Maritime Science, 12(4): 377-394. doi: 10.4274/jems.2024.35682.
  • Xu, X., Wu, B., Teixeira, Â.P., Yan, X., Soares, C.G. (2023). Integration of radar sequential images and AIS for ship speed and heading estimation under uncertainty. IEEE Transactions on Intelligent Transportation Systems, 25(6): 5688-5702. doi: 10.1109/TITS.2023.3338293.
  • Yu, Q., Wang, B., Su, Y. (2021). Object detection-tracking algorithm for unmanned surface vehicles based on a radar-photoelectric system. IEEE Access, 9: 57529-57541. doi: 10.1109/ACCESS.2021.3072897.


Details

Primary Language English
Subjects Marine Electronics, Control and Automation
Journal Section Research Article
Authors

Sercan Külcü 0000-0002-4871-709X

Submission Date July 16, 2025
Acceptance Date September 1, 2025
Early Pub Date October 1, 2025
Publication Date March 1, 2026
DOI https://doi.org/10.52998/trjmms.1743670
IZ https://izlik.org/JA92FC63LR
Published in Issue Year 2026 Volume: 12 Issue: 1

Cite

APA Külcü, S. (2026). Radar image-based object detection and tracking for autonomous surface vehicles. Turkish Journal of Maritime and Marine Sciences, 12(1), 35-47. https://doi.org/10.52998/trjmms.1743670

Creative Commons License

This journal is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License (CC BY-NC-ND 4.0).