Research Article

A comprehensive computational cost analysis for state-of-the-art visual SLAM methods for autonomous mapping

Year 2023, Volume: 65, Issue: 1, 1-15, 03.06.2023
https://doi.org/10.33769/aupse.1195379

Abstract

Solving the autonomous mapping problem with high accuracy and limited energy resources is important in environments without prior knowledge and/or an external positioning signal. Visual Simultaneous Localization and Mapping (SLAM) addresses the problem of determining the position and orientation of an autonomous vehicle or robot from various on-board sensors while simultaneously creating a map of the environment with low energy consumption. However, visual SLAM methods require high processing performance for real-time operation, and the processing capability of the hardware is limited by power constraints. It is therefore necessary to compare the processing load and power consumption of visual SLAM methods for autonomous vehicles or robots. Although several comparison studies of visual SLAM methods exist, there is no comprehensive computational cost analysis covering different datasets and the key parameters of absolute trajectory error (ATE), RAM usage, CPU load, and GPU load together with total power consumption. In this paper, the state-of-the-art visual SLAM methods ORB-SLAM2, Direct Sparse Odometry (DSO), and DSO with Loop Closure (LDSO) are compared. Besides the accuracy of these methods, energy consumption and resource usage are evaluated, allowing selection of the appropriate SLAM method.
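The accuracy metric named above, absolute trajectory error, can be computed with the evo package cited in the references (Grupp, 2021). The following is a minimal sketch, assuming ground-truth and estimated trajectories in TUM format; the file names are placeholders, not files from the paper.

```python
# Minimal ATE sketch using the evo package (Grupp, 2021); the two
# trajectory files are placeholders and must be in TUM format.
from evo.core import metrics, sync
from evo.tools import file_interface

traj_ref = file_interface.read_tum_trajectory_file("groundtruth.txt")
traj_est = file_interface.read_tum_trajectory_file("estimate.txt")

# Match poses by timestamp, then align the estimate to the reference.
# correct_scale=True performs a Sim(3) alignment, which is needed for
# monocular methods such as DSO and LDSO that recover the trajectory
# only up to an unknown scale factor.
traj_ref, traj_est = sync.associate_trajectories(traj_ref, traj_est)
traj_est.align(traj_ref, correct_scale=True)

ape = metrics.APE(metrics.PoseRelation.translation_part)
ape.process_data((traj_ref, traj_est))
print("ATE RMSE [m]:", ape.get_statistic(metrics.StatisticsType.rmse))
```

For the resource side of the comparison (RAM usage, CPU load, GPU load, power consumption), one possible approach is to sample the running SLAM process periodically. The sketch below uses psutil and NVML via pynvml; it is an illustration of such a sampler under stated assumptions, not the authors' measurement setup, and slam_pid is a hypothetical process ID.

```python
# Hypothetical resource sampler: psutil for per-process CPU/RAM, NVML
# (via pynvml) for GPU utilization and board power. Not the paper's
# measurement pipeline; slam_pid is a placeholder for the SLAM process.
import time
import psutil
import pynvml

def sample_usage(slam_pid, period_s=1.0):
    proc = psutil.Process(slam_pid)
    pynvml.nvmlInit()
    gpu = pynvml.nvmlDeviceGetHandleByIndex(0)
    samples = []
    try:
        while proc.is_running():
            cpu = proc.cpu_percent(interval=period_s)   # % of one core
            ram = proc.memory_info().rss / 2**20        # resident set, MiB
            load = pynvml.nvmlDeviceGetUtilizationRates(gpu).gpu  # GPU %
            power = pynvml.nvmlDeviceGetPowerUsage(gpu) / 1000.0  # watts
            samples.append((time.time(), cpu, ram, load, power))
    except psutil.NoSuchProcess:
        pass  # the SLAM process exited between checks
    finally:
        pynvml.nvmlShutdown()
    return samples
```

On embedded platforms without NVML (e.g., NVIDIA Jetson boards, where tegrastats reports utilization and power), the GPU and power readings would come from a different interface.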

References

  • Yousif, K., Bab-Hadiashar, A., Hoseinnezhad, R., An overview to visual odometry and visual SLAM: Applications to mobile robotics, Intell. Ind. Syst., 1 (4) (2015), 289-311, https://doi.org/10.1007/s40903-015-0032-7.
  • Huletski, A., Kartashov, D., Krinkin, K., Evaluation of the modern visual SLAM methods, Artificial Intelligence and Natural Language and Information Extraction, Social Media and Web Search FRUCT Conference (AINL-ISMW FRUCT), (2015), 19-25, https://doi.org/10.1109/AINL-ISMW-FRUCT.2015.7382963.
  • Güçlü, O., Simultaneous Localization and Mapping Using RGB-D Sensors, Doctoral Thesis, Hacettepe University, Turkey, (2018).
  • Durrant-Whyte, H., Bailey, T., Simultaneous localization and mapping (SLAM): Part I: The essential algorithms, IEEE Robot. Autom. Mag., 13 (2) (2006), 99-110, https://doi.org/10.1109/MRA.2006.1638022.
  • Chong, T. J., Tang, X. J., Leng, C. H., et al., Sensor technologies and simultaneous localization and mapping (SLAM), Procedia Comput. Sci., 76 (2015), 174-179, https://doi.org/10.1016/j.procs.2015.12.336.
  • Filipenko, M., Afanasyev, I., Comparison of various SLAM systems for mobile robot in an indoor environment, International Conference on Intelligent Systems (IS), (2018), 400-407, https://doi.org/10.1109/IS.2018.8710464.
  • Klein, G., Murray, D., Parallel tracking and mapping for small AR workspaces, 6th IEEE and ACM International Symposium on Mixed and Augmented Reality, (2007), 225-234.
  • Mur-Artal, R., Montiel, J. M. M., Tardos, J. D., ORB-SLAM: A versatile and accurate monocular SLAM system, IEEE Trans. Robot., 31 (5) (2015), 1147-1163, https://doi.org/10.1109/TRO.2015.2463671.
  • Liang, X., Chen, H., Li, Y., Liu, Y., Visual laser-SLAM in large-scale indoor environments, International Conference on Robotics and Biomimetics, (2016), 19-24.
  • Engel, J., Schops, T., Cremers, D., LSD-SLAM: Large-scale direct monocular SLAM, European Conference on Computer Vision, (2014), 834-849, https://doi.org/10.1007/978-3-319-10605-2_54.
  • Forster, C., Pizzoli, M., Scaramuzza, D., SVO: Fast semi-direct monocular visual odometry, IEEE International Conference on Robotics and Automation (ICRA), (2014), 15-22, https://doi.org/10.1109/ICRA.2014.6906584.
  • Kerl, C., Sturm, J., Cremers, D., Dense visual SLAM for RGB-D cameras, IEEE/RSJ International Conference on Intelligent Robots and Systems, (2013), 2100-2106, https://doi.org/10.1109/IROS.2013.6696650.
  • Whelan, T., Salas-Moreno, R., Glocker, B., ElasticFusion: Real-time dense SLAM and light source estimation, Int. J. Robot. Res., 35 (14) (2016), 1697-1716, https://doi.org/10.1177/0278364916669237.
  • Sturm, J., Engelhard, N., Endres, F., A benchmark for the evaluation of RGB-D SLAM systems, International Conference on Intelligent Robots and Systems, (2012), 573-580, https://doi.org/10.1109/IROS.2012.6385773.
  • Engel, J., Usenko, V., Cremers, D., A photometrically calibrated benchmark for monocular visual odometry, (2016), arXiv:1607.02555.
  • Butler, D. J., Wulff, J., Stanley, G. B., A naturalistic open source movie for optical flow evaluation, European Conference on Computer Vision, (2012), 611-625, https://doi.org/10.1007/978-3-642-33783-3_44.
  • Peris, M., Martull, S., Maki, A., Towards a simulation driven stereo vision system, Proceedings of the 21st International Conference on Pattern Recognition, (2012), 1038-1042.
  • Silberman, N., Hoiem, D., Kohli, P., Indoor segmentation and support inference from RGBD images, European Conference on Computer Vision, (2012), 746-760.
  • Delmerico, J., Scaramuzza, D., A benchmark comparison of monocular visual-inertial odometry algorithms for flying robots, International Conference on Robotics and Automation (ICRA), (2018), 2502-2509, https://doi.org/10.1109/ICRA.2018.8460664.
  • Mourikis, A. I., Roumeliotis, S. I., A multi-state constraint Kalman filter for vision aided inertial navigation, Proceedings IEEE International Conference on Robotics and Automation, (2007), 3565-3572, https://doi.org/10.1109/ROBOT.2007.364024.
  • Leutenegger, S., Lynen, S., Bosse, M., Keyframe-based visual–inertial odometry using nonlinear optimization, Int. J. Robot. Res., 34 (3) (2015), 314-334, https://doi.org/10.1177/0278364914554813.
  • Bloesch, M., Omari, S., Hutter, M., Robust visual inertial odometry using a direct EKF based approach, IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), (2015), 298-304, https://doi.org/10.1109/IROS.2015.7353389.
  • Qin, T., Li, P., Shen, S., VINS-Mono: A robust and versatile monocular visual-inertial state estimator, IEEE Trans. Robot., 34 (4) (2018), 1004-1020, https://doi.org/10.1109/TRO.2018.2853729.
  • Lynen, S., Achtelik, M. W., Weiss, S., A robust and modular multi-sensor fusion approach applied to MAV navigation, IEEE/RSJ International Conference on Intelligent Robots and Systems, (2013), 3923-3929, https://doi.org/10.1109/IROS.2013.6696917.
  • Forster, C., Carlone, L., Dellaert, F., On-manifold preintegration for real-time visual inertial odometry, IEEE Trans. Robot., 33 (1) (2016), 1-21, https://doi.org/10.1109/TRO.2016.2597321.
  • Labbe, M., Michaud, F., Online global loop closure detection for large-scale multi-session graph-based SLAM, IEEE/RSJ International Conference on Intelligent Robots and Systems, (2014), 2661-2666, https://doi.org/10.1109/IROS.2014.6942926.
  • Pire, T., Fischer, T., Castro, G., S-PTAM: Stereo parallel tracking and mapping, Robot. Auton. Syst., 93 (2017), 27-42, https://doi.org/10.1016/j.robot.2017.03.019.
  • Bescos, B., Fácil, J. M., Civera, J., DynaSLAM: Tracking, mapping, and inpainting in dynamic scenes, IEEE Robot. Autom. Lett., 3 (4) (2018), 4076-4083, https://doi.org/10.1109/LRA.2018.2860039.
  • Mingachev, E., Lavrenov, R., Tsoy, T., Comparison of ROS-based monocular visual SLAM methods: DSO, LDSO, ORB-SLAM2 and DynaSLAM, International Conference on Interactive Collaborative Robotics, (2020), 222-233, https://doi.org/10.1007/978-3-030-60337-3_22.
  • Mur-Artal, R., Tardos, J. D., ORB-SLAM2: An open-source SLAM system for monocular, stereo, and RGB-D cameras, IEEE Trans. Robot., 33 (5) (2017), 1255-1262, https://doi.org/10.1109/TRO.2017.2705103.
  • Engel, J., Koltun, V., Cremers, D., Direct sparse odometry, IEEE Trans. Pattern Anal. Mach. Intell., 40 (3) (2018), 611-625, https://doi.org/10.1109/TPAMI.2017.2658577.
  • Gao, X., Wang, R., Demmel, N., Cremers, D., LDSO: Direct sparse odometry with loop closure, IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), (2018), 2198-2204.
  • Grupp, M., evo: Python package for the evaluation of odometry and SLAM, (2021). Available at: https://github.com/MichaelGrupp/evo.
  • Geiger, A., Lenz, P., Stiller, C., Vision meets robotics: The KITTI dataset, Int. J. Robot. Res., 32 (11) (2013), 1231-1237, https://doi.org/10.1177/0278364913491297.
  • Handa, A., Whelan, T., McDonald, J., A benchmark for RGB-D visual odometry, 3D reconstruction and SLAM, IEEE International Conference on Robotics and Automation (ICRA), (2014), 1524-1531, https://doi.org/10.1109/ICRA.2014.6907054.
There are 35 references in total.

Details

Primary Language English
Subjects Engineering
Journal Section Research Articles
Authors

Ömer Faruk Yanık (ORCID: 0000-0001-7832-1945)

Hakki Alparslan Ilgın (ORCID: 0000-0003-0112-4833)

Early Pub Date May 17, 2023
Publication Date June 3, 2023
Submission Date October 27, 2022
Acceptance Date November 9, 2022
Published in Issue Year 2023 Volume: 65 Issue: 1

Cite

APA Yanık, Ö. F., & Ilgın, H. A. (2023). A comprehensive computational cost analysis for state-of-the-art visual SLAM methods for autonomous mapping. Communications Faculty of Sciences University of Ankara Series A2-A3 Physical Sciences and Engineering, 65(1), 1-15. https://doi.org/10.33769/aupse.1195379
AMA Yanık ÖF, Ilgın HA. A comprehensive computational cost analysis for state-of-the-art visual SLAM methods for autonomous mapping. Commun.Fac.Sci.Univ.Ank.Series A2-A3: Phys.Sci. and Eng. June 2023;65(1):1-15. doi:10.33769/aupse.1195379
Chicago Yanık, Ömer Faruk, and Hakki Alparslan Ilgın. “A Comprehensive Computational Cost Analysis for State-of-the-Art Visual SLAM Methods for Autonomous Mapping”. Communications Faculty of Sciences University of Ankara Series A2-A3 Physical Sciences and Engineering 65, no. 1 (June 2023): 1-15. https://doi.org/10.33769/aupse.1195379.
EndNote Yanık ÖF, Ilgın HA (June 1, 2023) A comprehensive computational cost analysis for state-of-the-art visual SLAM methods for autonomous mapping. Communications Faculty of Sciences University of Ankara Series A2-A3 Physical Sciences and Engineering 65 1 1–15.
IEEE Ö. F. Yanık and H. A. Ilgın, “A comprehensive computational cost analysis for state-of-the-art visual SLAM methods for autonomous mapping”, Commun.Fac.Sci.Univ.Ank.Series A2-A3: Phys.Sci. and Eng., vol. 65, no. 1, pp. 1–15, 2023, doi: 10.33769/aupse.1195379.
ISNAD Yanık, Ömer Faruk - Ilgın, Hakki Alparslan. “A Comprehensive Computational Cost Analysis for State-of-the-Art Visual SLAM Methods for Autonomous Mapping”. Communications Faculty of Sciences University of Ankara Series A2-A3 Physical Sciences and Engineering 65/1 (June 2023), 1-15. https://doi.org/10.33769/aupse.1195379.
JAMA Yanık ÖF, Ilgın HA. A comprehensive computational cost analysis for state-of-the-art visual SLAM methods for autonomous mapping. Commun.Fac.Sci.Univ.Ank.Series A2-A3: Phys.Sci. and Eng. 2023;65:1–15.
MLA Yanık, Ömer Faruk and Hakki Alparslan Ilgın. “A Comprehensive Computational Cost Analysis for State-of-the-Art Visual SLAM Methods for Autonomous Mapping”. Communications Faculty of Sciences University of Ankara Series A2-A3 Physical Sciences and Engineering, vol. 65, no. 1, 2023, pp. 1-15, doi:10.33769/aupse.1195379.
Vancouver Yanık ÖF, Ilgın HA. A comprehensive computational cost analysis for state-of-the-art visual SLAM methods for autonomous mapping. Commun.Fac.Sci.Univ.Ank.Series A2-A3: Phys.Sci. and Eng. 2023;65(1):1-15.

Communications Faculty of Sciences University of Ankara Series A2-A3 Physical Sciences and Engineering


This work is licensed under a Creative Commons Attribution 4.0 International License.