Research Article

Layer Selection for Subtraction and Concatenation: A Method for Visual Velocity Estimation of a Mobile Robot

Year 2024, Volume 13, Issue 2, pp. 384–392, 29.06.2024
https://doi.org/10.17798/bitlisfen.1341929

Abstract

Kinematic information such as position, velocity, and acceleration is critical for determining the three-dimensional state of a robot in space. This study aims to estimate the linear and angular velocity of a mobile robot visually. A further aim is to determine whether a concatenation layer or a subtraction layer is better suited to the Convolutional Neural Network (CNN) that performs this estimation. For these purposes, a simulation environment was first created, from which 9000 image pairs and the corresponding velocity information were collected for training; similarly, 1000 image pairs with velocity information were gathered for validation. Four different CNN models were designed, then trained and tested on these datasets. In the tests, the lowest mean error was 0.93e-3 m/s for linear velocity estimation and 4.37e-3 rad/s for angular velocity estimation. Statistical analysis of the errors showed that the results are sufficient for linear and angular velocity prediction. It was also observed that the subtraction layer can be used instead of the concatenation layer in CNN architectures for hardware-limited systems. In conclusion, this study achieves visual velocity estimation for mobile robots and outlines a framework of CNN models for this problem.
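The architectural choice compared in the abstract is how the features of two consecutive camera frames are fused before the velocity regression head: channel-wise concatenation versus element-wise subtraction. The sketch below illustrates both options in PyTorch; the layer sizes, input resolution, and module names are illustrative assumptions and do not reproduce the authors' exact networks.

# A minimal sketch (assumed architecture, not the paper's exact models) of the
# two feature-fusion options: subtracting vs. concatenating the feature maps of
# two consecutive frames before regressing [linear velocity, angular velocity].
import torch
import torch.nn as nn


class VisualVelocityNet(nn.Module):
    def __init__(self, fusion: str = "subtract"):
        super().__init__()
        assert fusion in ("subtract", "concat")
        self.fusion = fusion
        # Shared convolutional encoder applied to both frames.
        self.encoder = nn.Sequential(
            nn.Conv2d(3, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d((8, 8)),
        )
        # Concatenation doubles the channel count entering the head;
        # subtraction keeps it unchanged, so it needs fewer parameters.
        in_ch = 32 if fusion == "subtract" else 64
        self.head = nn.Sequential(
            nn.Flatten(),
            nn.Linear(in_ch * 8 * 8, 64), nn.ReLU(),
            nn.Linear(64, 2),  # outputs: [v (m/s), omega (rad/s)]
        )

    def forward(self, img_t, img_t1):
        f_t, f_t1 = self.encoder(img_t), self.encoder(img_t1)
        if self.fusion == "subtract":
            fused = f_t1 - f_t
        else:
            fused = torch.cat([f_t, f_t1], dim=1)
        return self.head(fused)


# Usage: a pair of consecutive frames -> predicted [v, omega].
model = VisualVelocityNet(fusion="subtract")
pred = model(torch.randn(1, 3, 128, 128), torch.randn(1, 3, 128, 128))
print(pred.shape)  # torch.Size([1, 2])

Because subtraction leaves the channel count unchanged while concatenation doubles it, the subtraction variant has a smaller regression head, which is consistent with the paper's observation that it is preferable for hardware-limited systems.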


Details

Primary Language English
Subjects Mechatronics Engineering
Section Research Article
Authors

Mustafa Can Bıngol (ORCID: 0000-0001-5448-8281)

Early View Date 27 June 2024
Publication Date 29 June 2024
Submission Date 12 August 2023
Acceptance Date 7 June 2024
Published in Issue Year 2024, Volume 13, Issue 2

How to Cite

IEEE M. C. Bıngol, “Layer Selection for Subtraction and Concatenation: A Method for Visual Velocity Estimation of a Mobile Robot”, Bitlis Eren Üniversitesi Fen Bilimleri Dergisi, vol. 13, no. 2, pp. 384–392, 2024, doi: 10.17798/bitlisfen.1341929.



Bitlis Eren Üniversitesi
Fen Bilimleri Dergisi Editorial Office

Bitlis Eren Üniversitesi Graduate Education Institute
Beş Minare Mah. Ahmet Eren Bulvarı, Merkez Kampüs, 13000 BİTLİS
E-mail: fbe@beu.edu.tr