Research Article
Year 2023, Volume: 11, Issue: 2, 41 - 46, 18.05.2023
https://doi.org/10.21541/apjess.1297168

Supporting Institution

OPTİMAK SÜREÇ TASARIMI VE UYGULAMALARI A.Ş

Acknowledgements

We would like to thank OPTİMAK STU A.Ş for the working environment and product support it provided.

A Novel Method Determining the Size and Angle of an Object Using a Depth Camera Without Reference

Abstract

In traditional methods, a fixed object of known size is taken as a reference, and the dimensions of other objects are calculated by comparison with this reference object. However, when objects at different heights are to be measured, the reference measurement must also be changed. In this study, the size and angle of products flowing along a line were determined using a depth camera. A data set was created by measuring an object of known dimensions from 34 different distances; it consists of the ratios of the number of pixels along the object's edges to the lengths of those edges. By examining the correlation between this data set and the distance values, a mathematical equation was derived with the help of MATLAB. With this equation, all three dimensions of an object can be calculated from its distance to the camera alone, regardless of the object's height or color. In addition, the angle between the object and a software-generated reference line was calculated by taking the long side of the object as the reference. In this way, the size and angle of the products flowing along the line were determined with a single camera, without the need for a reference object and without any discrimination by color or size.
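
The abstract describes a two-step computation: fit a curve that maps camera distance to the pixel-count/edge-length ratio, then invert that ratio to turn measured pixel counts into physical dimensions, and finally take the angle of the object's long side against a fixed reference line. The following Python sketch illustrates that pipeline under stated assumptions: the calibration values, the polynomial model form, and all function names are illustrative placeholders, not the authors' MATLAB equation or measured data.

    import numpy as np

    # Hypothetical calibration set: for each capture distance (mm), the ratio of
    # an edge's pixel count to its known physical length (pixels per mm).
    # The paper used 34 distances; six placeholder points are shown here.
    distances_mm = np.array([300.0, 400.0, 500.0, 600.0, 700.0, 800.0])
    px_per_mm = np.array([2.10, 1.58, 1.26, 1.05, 0.90, 0.79])

    # Fit a model relating distance to the pixel/length ratio. The paper derives
    # its own equation in MATLAB; a quadratic in 1/d is only an illustrative stand-in.
    coeffs = np.polyfit(1.0 / distances_mm, px_per_mm, deg=2)

    def pixels_per_mm(distance_mm):
        """Predicted pixels-per-mm ratio at a given camera-to-object distance."""
        return np.polyval(coeffs, 1.0 / distance_mm)

    def edge_length_mm(edge_pixels, distance_mm):
        """Convert an edge's pixel count into millimetres using the fitted ratio."""
        return edge_pixels / pixels_per_mm(distance_mm)

    def angle_to_reference_deg(long_side_vec, reference_vec=(1.0, 0.0)):
        """Angle between the object's long side and a software-generated reference line."""
        v = np.asarray(long_side_vec, dtype=float)
        r = np.asarray(reference_vec, dtype=float)
        cos_a = np.dot(v, r) / (np.linalg.norm(v) * np.linalg.norm(r))
        return float(np.degrees(np.arccos(np.clip(cos_a, -1.0, 1.0))))

    # Example: an edge spanning 240 px at a depth reading of 520 mm,
    # with the detected long side pointing along (3, 1).
    print(edge_length_mm(240, 520.0))          # estimated edge length in mm
    print(angle_to_reference_deg((3.0, 1.0)))  # angle to the reference line in degrees

In a real setup the distance value would be read from the depth frame and the pixel counts from the segmented object contour; both acquisition steps are omitted from this sketch.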

References

  • Ohio State University, “How the brain sees the world in 3-D: Scientists record visual cortex combining 2-D and depth info,” ScienceDaily, 2017. [Online]. Available: www.sciencedaily.com/releases/2017/03/170321110312.htm
  • M. S. Ahn, H. Chae, D. Noh, H. Nam, and D. Hong, “Analysis and Noise Modeling of the Intel RealSense D435 for Mobile Robots,” 2019. doi: 10.1109/URAI.2019.8768489.
  • S. Lee, “Depth camera image processing and applications,” in 2012 19th IEEE International Conference on Image Processing, 2012, pp. 545–548.
  • J.-H. Cho, I.-Y. Chang, S. Kim, and K. H. Lee, “Depth image processing technique for representing human actors in 3DTV using single depth camera,” in 2007 3DTV Conference, 2007, pp. 1–4.
  • J.-S. Jeong, K.-C. Kwon, M.-U. Erdenebat, Y. Piao, N. Kim, and K.-H. Yoo, “Development of a real-time integral imaging display system based on graphics processing unit parallel processing using a depth camera,” Opt. Eng., vol. 53, no. 1, p. 15103, 2014.
  • K. Adi and C. E. Widodo, “Distance Measurement With a Stereo Camera,” Int. J. Innov. Res. Adv. Eng., 2017.
  • J. B. Kim, “Efficient vehicle detection and distance estimation based on aggregated channel features and inverse perspective mapping from a single camera,” Symmetry (Basel), 2019, doi: 10.3390/sym11101205.
  • G. Lin, Y. Tang, X. Zou, J. Xiong, and J. Li, “Guava detection and pose estimation using a low-cost RGB-D sensor in the field,” Sensors (Switzerland), 2019, doi: 10.3390/s19020428.
  • B. Parr, M. Legg, and F. Alam, “Analysis of Depth Cameras for Proximal Sensing of Grapes,” Sensors, vol. 22, no. 11, 2022, doi: 10.3390/s22114179.
  • P. Kurtser, O. Ringdahl, N. Rotstein, R. Berenstein, and Y. Edan, “In-field grape cluster size assessment for vine yield estimation using a mobile robot and a consumer level RGB-D Camera,” IEEE Robot. Autom. Lett., 2020, doi: 10.1109/LRA.2020.2970654.
  • Z. Wang, K. B. Walsh, and B. Verma, “On-tree mango fruit size estimation using RGB-D images,” Sensors (Switzerland), 2017, doi: 10.3390/s17122738.
  • N. Maeda, H. Suzuki, T. Kitajima, A. Kuwahara, and T. Yasuno, “Measurement of Tomato Leaf Area Using Depth Camera,” J. Signal Process., vol. 26, no. 4, pp. 123–126, 2022.
  • B. Zheng, G. Sun, Z. Meng, and R. Nan, “Vegetable Size Measurement Based on Stereo Camera and Keypoints Detection,” Sensors, vol. 22, no. 4, 2022, doi: 10.3390/s22041617.
  • C. Vo-Le, P. Van Muoi, N. H. Son, N. Van San, V. K. Duong, and N. T. Huyen, “Automatic Method for Measuring Object Size Using 3D Camera,” ICCE 2020 - 2020 IEEE 8th Int. Conf. Commun. Electron., pp. 365–369, 2021, doi: 10.1109/ICCE48956.2021.9352115.
  • H. Xu, J. Xu, and W. Xu, “Survey of 3D modeling using depth cameras,” Virtual Reality and Intelligent Hardware. 2019. doi: 10.1016/j.vrih.2019.09.003.
  • L. Wu, Y. Long, H. Sun, N. Liu, W. Wang, and Q. Dong, “Length Measurement of Potato Leaf using Depth Camera,” 2018. doi: 10.1016/j.ifacol.2018.08.197.
  • A. Ruchay, V. Kober, K. Dorofeev, V. Kolpakov, and S. Miroshnikov, “Accurate body measurement of live cattle using three depth cameras and non-rigid 3-D shape recovery,” Comput. Electron. Agric., 2020, doi: 10.1016/j.compag.2020.105821.
  • B. A. Griffin and J. J. Corso, “Depth from Camera Motion and Object Detection,” Proc. IEEE Comput. Soc. Conf. Comput. Vis. Pattern Recognit., pp. 1397–1406, 2021, doi: 10.1109/CVPR46437.2021.00145.
  • L. Keselman et al., “Intel RealSense Stereoscopic Depth Cameras,” Comput. Vis. Pattern Recognit., 2017.
There are 19 references in total.

Details

Primary Language: English
Subjects: Artificial Intelligence
Section: Research Articles
Authors

Bilal Gürevin 0000-0003-4035-2759

Ramazan Gül 0000-0001-5350-4753

Sema Eğri 0000-0001-9587-2108

Furkan Gültürk 0000-0002-2494-487X

Muhammed Yıldız 0000-0002-0530-2345

Fatih Çalışkan 0000-0002-9568-7049

İhsan Pehlivan 0000-0001-6107-655X

Early View Date: May 18, 2023
Publication Date: May 18, 2023
Submission Date: May 15, 2023
Published in Issue: Year 2023, Volume: 11, Issue: 2

Cite

IEEE B. Gürevin, “A Novel Method Determining the Size and Angle of an Object Using a Depth Camera Without Reference”, APJESS, vol. 11, no. 2, pp. 41–46, 2023, doi: 10.21541/apjess.1297168.

Academic Platform Journal of Engineering and Smart Systems