Research Article

Unmanned aerial vehicles precision landing on a moving platform using image matrix segmentation method

Year 2024, Volume: 4 Issue: 1, 1 - 12, 30.06.2024

Abstract

This paper presents the theory, planning, control, and method for an unmanned aerial vehicle (UAV) performing a precision landing on a moving platform. UAVs on flight missions are often difficult to retrieve because inaccuracies in the landing sequence may lead to a crash. In this paper, using an adaptive method and image processing, an H480 hexacopter equipped with a gimbaled camera detects a moving platform mounted on a ground rover through a pattern recognition algorithm. Using an AprilTag as the unique pattern, the H480 follows the moving platform via pitch and roll commands while continuously descending toward the ground. The proposed system varies the degree of pitch and roll according to the position of the AprilTag: the further the detected tag is from the camera center, the larger the commanded movement, so that the tag is driven toward the center of the camera frame. The system divides the camera frame into an 11x11 matrix in which each cell prescribes different pitch and roll degrees for the H480's movement. As a result, the system successfully guides the H480 onto the moving platform with less than 0.5 m offset from the center of the target.
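The matrix-segmentation idea described above can be sketched as a small function: the detected tag center is mapped to a cell of an 11x11 grid, and the pitch/roll commands grow with the cell's distance from the center cell. This is a minimal illustration, not the authors' implementation; the frame size (640x480) and maximum tilt (10 degrees) are assumed values not specified in the abstract.

```python
def cell_index(coord, frame_size, grid=11):
    """Map a pixel coordinate to a grid cell index in [0, grid - 1]."""
    idx = int(coord * grid / frame_size)
    return min(max(idx, 0), grid - 1)

def pitch_roll_from_tag(cx, cy, width=640, height=480, grid=11, max_angle_deg=10.0):
    """Return (pitch, roll) commands in degrees from the tag center (cx, cy).

    The frame is split into a grid x grid matrix; the command magnitude
    grows with the cell's distance from the center cell, pushing the tag
    toward the center of the camera frame (width, height and max_angle_deg
    are illustrative assumptions).
    """
    col = cell_index(cx, width, grid)
    row = cell_index(cy, height, grid)
    center = grid // 2              # center cell index (5 for an 11x11 grid)
    step = max_angle_deg / center   # degrees of tilt per cell of offset
    roll = (col - center) * step    # lateral correction
    pitch = (row - center) * step   # forward/backward correction
    return pitch, roll
```

With these assumptions, a tag at the frame center yields zero commands, while a tag near a corner yields the maximum commanded tilt on both axes.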

Supporting Institution

Collaborative Research in Engineering, Science and Technology (CREST) and UiTM Pulau Pinang

Project Number

T04C2-17

References

  • [1] Al-Kaff, A., Martín, D., García, F., de la Escalera, A., & Armingol, J.M. Survey of computer vision algorithms and applications for unmanned aerial vehicles, Expert Systems with Applications, 2018; 92; 447–463. DOI: 10.1016/j.eswa.2017.09.033.
  • [2] Hasegawa, A., Goto, T., & Hirano, S. Accuracy improvement of image recognition by contrast correction for autonomous drone flights, IEEE 7th Global Conference on Consumer Electronics (GCCE), Nara, Japan, Oct. 2018; 45–46. DOI: 10.1109/GCCE.2018.8574487.
  • [3] Xu, G., Zhang, Y., Ji, S., Cheng, Y., & Tian, Y. Research on computer vision-based for UAV autonomous landing on a ship, Pattern Recognition Letters, 2009; 30(6); 600–605. DOI: 10.1016/j.patrec.2008.12.011.
  • [4] Joo, S., Al-Ali, K., Ippolito, C., & Yeh, Y.H. Towards autonomous fixed-wing unmanned aerial vehicle landing: A vision-aided inertial navigation under sensor reconfiguration scenario, IFAC Proceedings Volumes, 2008; 41(2); 13677–13682. DOI: 10.3182/20080706-5-KR-1001.02316.
  • [5] Marcon, P., Janousek, J., & Kadlec, R. Vision-based and differential global positioning system to ensure precise autonomous landing of UAVs, 2018 Progress in Electromagnetics Research Symposium, Toyama, 2018; 542–546. DOI: 10.23919/PIERS.2018.8598179.
  • [6] Jiang, T., Lin, D., & Song, T. Vision-based autonomous landing of a quadrotor using a gimbaled camera, Proceedings of the Institution of Mechanical Engineers, Part G: Journal of Aerospace Engineering, 2019; 233(14); 5093–5106. DOI: 10.1177/0954410019837777.
  • [7] Serra, P., Cunha, R., Hamel, T., Cabecinhas, D., & Silvestre, C. Landing of a quadrotor on a moving target using dynamic image-based visual servo control, IEEE Transactions on Robotics, 2016; 32(6); 1524–1535. DOI: 10.1109/TRO.2016.2604495.
  • [8] Olson, E. AprilTag: A robust and flexible visual fiducial system, 2011 IEEE International Conference on Robotics and Automation, Shanghai, China, May 2011; 3400–3407. DOI: 10.1109/ICRA.2011.5979561.
  • [9] Feng, Y., Zhang, C., Baek, S., Rawashdeh, S., & Mohammadi, A. Autonomous landing of a UAV on a moving platform using model predictive control, Drones, 2018; 2(4); 34. DOI: 10.3390/drones2040034.
  • [10] Jiang, Z., Liu, Y., Wu, B., & Zhu, Q. Monocular vision based UAV target detection and ranging system implemented on OpenCV and TensorFlow, 2019 18th International Symposium on Distributed Computing and Applications for Business Engineering and Science (DCABES), Wuhan, China, Nov. 2019; 88–91. DOI: 10.1109/DCABES48411.2019.00029.
  • [11] Kallwies, J., Forkel, B., & Wuensche, H.J. Determining and improving the localization accuracy of AprilTag detection, 2020 IEEE International Conference on Robotics and Automation (ICRA), Paris, France, May 2020; 8288–8294. DOI: 10.1109/ICRA40945.2020.9197427.
  • [12] Krishna, N. Camera calibration with example in Python, Medium, Feb. 7, 2022. https://towardsdatascience.com/camera-calibration-with-example-in-python-5147e945cdeb (accessed May 17, 2022).
  • [13] Trujillo, J.C., Munguia, R., Guerra, E., & Grau, A. Cooperative monocular-based SLAM for multi-UAV systems in GPS-denied environments, Sensors, 2018; 18(5); 1351. DOI: 10.3390/s18051351.
  • [14] Perumal, L. Euler angles: conversion of arbitrary rotation sequences to specific rotation sequence, Computer Animation and Virtual Worlds, 2014; 25(5–6); 521–529. DOI: 10.1002/cav.1529.
  • [15] Artale, V., Milazzo, C.L.R., & Ricciardello, A. A quaternion-based simulation of multirotor dynamics, International Journal of Modeling, Simulation, and Scientific Computing, 2015; 6(1); 1550009. DOI: 10.1142/S1793962315500099.
  • [16] Simulation, PX4 User Guide. [Online]. Available: https://docs.px4.io/v1.12/en/simulation/ (accessed Sep. 6, 2022).
  • [17] Zhang, L. et al. Vision-based target three-dimensional geolocation using unmanned aerial vehicles, IEEE Transactions on Industrial Electronics, 2018; 65(10); 8052–8061.
  • [18] Xu, G., Zhang, Y., Ji, S., Cheng, Y., & Tian, Y. Research on computer vision-based for UAV autonomous landing on a ship, Pattern Recognition Letters, 2009; 30(6); 600–605.

Details

Primary Language English
Subjects Software Engineering, Software Architecture
Journal Section Research Articles
Authors

Muhammad Farris Khyasudeen 0000-0001-8224-425X

Muhammad Badri Mhd Noor 0000-0001-9433-632X

Saaidal Razalli Azzuhri 0000-0001-8603-8840

Norlida Buniyamin 0000-0002-4256-6809

Project Number T04C2-17
Early Pub Date December 1, 2023
Publication Date June 30, 2024
Acceptance Date July 30, 2023
Published in Issue Year 2024 Volume: 4 Issue: 1

Cite

Vancouver Khyasudeen MF, Mhd Noor MB, Azzuhri SR, Buniyamin N. Unmanned aerial vehicles precision landing on a moving platform using image matrix segmentation method. Computers and Informatics. 2024;4(1):1-12.