Visual-SLAM based 3-dimensional modelling of indoor environments
Year: 2024, Volume 9, Issue 3, 368–376
Simla Özbayrak, Veli İlçi
Abstract
Simultaneous localization and mapping (SLAM) is used in many fields to enable robots to map their surroundings and localize themselves in unfamiliar environments. Visual-SLAM (VSLAM), which uses a camera sensor, and LiDAR-SLAM, which uses a light detection and ranging (LiDAR) sensor, are the most prevalent SLAM methods. Thanks to its advantages, including lower cost than LiDAR, low energy consumption, durability, and rich environmental data, VSLAM is currently attracting considerable attention. This study aims to produce a three-dimensional (3D) model of an indoor environment using image data captured by a stereo camera mounted on an unmanned ground vehicle (UGV). Easily measured objects in the operating area were chosen to assess the accuracy of the generated model. The actual dimensions of these objects were measured, and the values were compared with those derived from the VSLAM-based 3D model. The evaluation showed that object dimensions derived from the model agreed with the measured values to within ±2 cm. The surface accuracy of the generated 3D model was also analysed: areas with flat wall and floor surfaces were selected, and the plane accuracy of these areas was assessed. The plane accuracy values of the selected surfaces were below ±1 cm.
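The two accuracy checks described in the abstract can be reproduced from any point cloud exported by a VSLAM pipeline. Below is a minimal sketch, not the authors' code: the function names, sample coordinates, and the synthetic wall patch are hypothetical illustrations of (1) comparing a tape-measured object dimension against the distance between two points picked from the model, and (2) reporting the residual spread of a least-squares plane fitted to a flat wall or floor patch.

```python
import numpy as np

def dimension_error(measured_m: float, p1: np.ndarray, p2: np.ndarray) -> float:
    """Signed difference (m) between an edge length taken from the model
    (distance between two selected cloud points) and its tape-measured value."""
    return float(np.linalg.norm(p2 - p1)) - measured_m

def plane_rmse(points: np.ndarray) -> float:
    """RMS point-to-plane distance for a least-squares plane fit.
    points: (N, 3) patch cropped from a flat wall or floor region."""
    centroid = points.mean(axis=0)
    # The singular vector with the smallest singular value of the centred
    # patch is the normal of the best-fit plane.
    _, _, vt = np.linalg.svd(points - centroid, full_matrices=False)
    normal = vt[-1]
    residuals = (points - centroid) @ normal  # signed orthogonal distances
    return float(np.sqrt(np.mean(residuals ** 2)))

# Hypothetical usage: a 0.50 m object edge and a synthetic flat wall patch.
p1 = np.array([1.02, 0.40, 0.00])
p2 = np.array([1.02, 0.90, 0.01])
print(f"edge error: {dimension_error(0.50, p1, p2) * 100:+.2f} cm")

rng = np.random.default_rng(0)
wall_patch = rng.normal(size=(500, 3)) * np.array([1.0, 1.0, 0.004])
print(f"plane RMSE: {plane_rmse(wall_patch) * 100:.2f} cm")
```

An SVD-based fit is used here because it minimises orthogonal (point-to-plane) distances, which matches how plane accuracy is conventionally reported for surface checks of this kind.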
Supporting Institution
Ondokuz Mayis University Scientific Research Projects
Project Numbers
PYO.MUH.1906.22.002, PYO.MUH.1908.22.079
Acknowledgments
This study was funded by Ondokuz Mayis University Scientific Research Projects (Project Nos. PYO.MUH.1906.22.002 and PYO.MUH.1908.22.079). We also thank the LOCUS-TEAM members for their support during this study.