Research Article

Data Transfer Platform Design for Kinect Applications

Year 2019, Volume: 7 Issue: 3, 473-480, 28.09.2019
https://doi.org/10.21541/apjes.451125

Abstract

In recent years, major advances have been made in software, hardware, and algorithms.
These technological developments have also affected sensor technologies. The Kinect
sensor, initially marketed as a gaming device, has attracted great interest from both
researchers and developers and has been used in the literature in different areas and
for different purposes. All data from the Kinect sensor is delivered to developers
through the Software Development Kit (SDK) provided by Microsoft. Under normal
conditions, the Kinect sensor produces between 240 and 270 thousand points per second,
depending on scene complexity. The purpose of this work is to design a data transfer
platform for Kinect applications. The developed platform works on a client-server
architecture. The platform, which provides different scenarios for online and offline
communication, also offers several filtering and encryption algorithms. It uses the
Point Cloud Library (PCL), a large-scale open-source project for 2D/3D image and point
cloud processing. VoxelGrid (VG) filtering, outlier filtering, histogram-based
conditional filtering, octree-based compression, and PGP encryption are available on
request. In addition, a special data structure has been developed for Kinect
applications, and WebRTC middleware is used for online communication. Through these
steps, unnecessary data points are removed and the remaining data is compressed,
secured, and packaged according to the developed data structure. The filtering has
yielded a compression ratio of 19.96%. The custom design allows filtering to be applied
per application or per client. The file compression approach applied after filtering
has achieved a further compression of 10.38%. The presented platform will improve the
performance of Kinect applications used by researchers and developers.
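As a rough illustration of the processing stages named above, the short C++ sketch below chains PCL's VoxelGrid downsampling, statistical outlier removal, and octree-based point cloud compression. It is a minimal sketch only: the leaf size, neighbourhood parameters, and compression profile are placeholder assumptions rather than the values used in the paper, and the histogram-based conditional filter, PGP encryption, and WebRTC transfer stages are omitted.

// Minimal sketch of the filtering/compression stages named in the abstract, assuming PCL 1.x.
// Parameter values are placeholders, not the paper's settings.
#include <pcl/point_types.h>
#include <pcl/point_cloud.h>
#include <pcl/filters/voxel_grid.h>
#include <pcl/filters/statistical_outlier_removal.h>
#include <pcl/compression/octree_pointcloud_compression.h>
#include <sstream>
#include <string>

using CloudT = pcl::PointCloud<pcl::PointXYZ>;

std::string filterAndCompress(const CloudT::ConstPtr& input)
{
    // 1) VoxelGrid (VG) filter: downsample the dense Kinect cloud onto a regular grid.
    CloudT::Ptr voxelized(new CloudT);
    pcl::VoxelGrid<pcl::PointXYZ> vg;
    vg.setInputCloud(input);
    vg.setLeafSize(0.01f, 0.01f, 0.01f);      // placeholder: 1 cm voxels
    vg.filter(*voxelized);

    // 2) Outlier filter: drop isolated noise points via statistical outlier removal.
    CloudT::Ptr denoised(new CloudT);
    pcl::StatisticalOutlierRemoval<pcl::PointXYZ> sor;
    sor.setInputCloud(voxelized);
    sor.setMeanK(50);                         // placeholder neighbourhood size
    sor.setStddevMulThresh(1.0);
    sor.filter(*denoised);

    // 3) Octree-based compression: serialize the cleaned cloud into a byte stream
    //    that can be wrapped in a transfer packet.
    pcl::io::OctreePointCloudCompression<pcl::PointXYZ> encoder(
        pcl::io::MED_RES_ONLINE_COMPRESSION_WITHOUT_COLOR);
    std::stringstream compressed;
    encoder.encodePointCloud(denoised, compressed);
    return compressed.str();
}

The returned byte stream is the kind of payload that the platform's custom data packets could carry to the server, for example over a WebRTC data channel.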

References

  • [1] Z. Zhang, "Microsoft Kinect Sensor and Its Effect," IEEE Multimedia, vol. 19, no. 2, pp. 4-10, 2012.
  • [2] M. Gabel, R. Gilad-Bachrach, E. Renshaw, and A. Schuster, "Full body gait analysis with Kinect," Conf Proc IEEE Eng Med Biol Soc, vol. 2012, pp. 1964-7, 2012.
  • [3] N. Kitsunezaki, E. Adachi, T. Masuda, and J. Mizusawa, "KINECT applications for the physical rehabilitation," pp. 294-299, 2013.
  • [4] B. Lange et al., "Interactive game-based rehabilitation using the Microsoft Kinect," pp. 171-172, 2012.
  • [5] T. Dutta, "Evaluation of the Kinect sensor for 3-D kinematic measurement in the workplace," Appl Ergon, vol. 43, no. 4, pp. 645-9, Jul 2012.
  • [6] I. P. T. Weerasinghe, J. Y. Ruwanpura, J. E. Boyd, and A. F. Habib, "Application of Microsoft Kinect Sensor for Tracking Construction Workers," pp. 858-867, 2012.
  • [7] Z. Zhang, M. Zhang, Y. Chang, E.-S. Aziz, S. K. Esche, and C. Chassapis, "Real-Time 3D Model Reconstruction and Interaction Using Kinect for a Game-Based Virtual Laboratory," p. V005T05A053, 2013.
  • [8] L. Cruz, D. Lucio, and L. Velho, "Kinect and RGBD Images: Challenges and Applications," pp. 36-49, 2012.
  • [9] H. Richards-Rissetto, J. von Schwerin, and G. Girardi, "Kinect and 3D GIS in archaeology," pp. 331-337, 2012.
  • [10] S. Izadi et al., "KinectFusion," p. 559, 2011.
  • [11] G. Du and P. Zhang, "Markerless human–robot interface for dual robot manipulators using Kinect sensor," Robotics and Computer-Integrated Manufacturing, vol. 30, no. 2, pp. 150-159, 2014.
  • [12] S. Zolkiewski and D. Pioskowik, "Robot Control and Online Programming by Human Gestures Using a Kinect Motion Sensor," vol. 275, pp. 593-604, 2014.
  • [13] R. A. El-laithy, J. Huang, and M. Yeh, "Study on the use of Microsoft Kinect for robotics applications," pp. 1280-1288, 2012.
  • [14] M. Eiji, C. Meifen, M. Toshiyuki, and H. Hiroshi, "Human motion tracking of mobile robot with Kinect 3D sensor," presented at the 2012 Proceedings of SICE Annual Conference (SICE), Akita, Japan, 20-23 Aug. 2012.
  • [15] Z.-R. Tsai, "Robust Kinect-based guidance and positioning of a multidirectional robot by Log-ab recognition," Expert Systems with Applications, vol. 41, no. 4, pp. 1271-1282, 2014.
  • [16] B. Lau, C. Sprunk, and W. Burgard, "Efficient grid-based spatial representations for robot navigation in dynamic environments," Robotics and Autonomous Systems, vol. 61, no. 10, pp. 1116-1130, 2013.
  • [17] K. Berger, S. Meister, R. Nair, and D. Kondermann, "A State of the Art Report on Kinect Sensor Setups in Computer Vision," vol. 8200, pp. 257-272, 2013.
  • [18] L. Yang, L. Zhang, H. Dong, A. Alelaiwi, and A. E. Saddik, "Evaluating and Improving the Depth Accuracy of Kinect for Windows v2," IEEE Sensors Journal, vol. 15, no. 8, pp. 4275-4285, 2015.
  • [19] B. Galna, G. Barry, D. Jackson, D. Mhiripiri, P. Olivier, and L. Rochester, "Accuracy of the Microsoft Kinect sensor for measuring movement in people with Parkinson's disease," Gait Posture, vol. 39, no. 4, pp. 1062-8, Apr 2014.
  • [20] C. Raposo, J. P. Barreto, and U. Nunes, "Fast and Accurate Calibration of a Kinect Sensor," pp. 342-349, 2013.
  • [21] K. Khoshelham and S. O. Elberink, "Accuracy and resolution of Kinect depth data for indoor mapping applications," Sensors (Basel), vol. 12, no. 2, pp. 1437-54, 2012.
  • [22] H. Gonzalez-Jorge, B. Riveiro, E. Vazquez-Fernandez, J. Martínez-Sánchez, and P. Arias, "Metrological evaluation of Microsoft Kinect and Asus Xtion sensors," Measurement, vol. 46, no. 6, pp. 1800-1806, 2013.
  • [23] J. Kramer, N. Burrus, F. Echtler, D. Herrera C., and M. Parker, Hacking the Kinect. Apress, 2012.
  • [24] L. Caruso, R. Russo, and S. Savino, "Microsoft Kinect V2 vision system in a manufacturing application," Robotics and Computer-Integrated Manufacturing, vol. 48, pp. 174-181, 2017.
  • [25] R. B. Rusu and S. Cousins, "3D is here: Point Cloud Library (PCL)," pp. 1-4, 2011.
  • [26] E. Bertin, S. Cubaud, S. Tuffin, N. Crespi, and V. Beltran, "WebRTC, the day after: What's next for conversational services?," pp. 46-52, 2013.
  • [27] S. Loreto and S. P. Romano, "Real-Time Communications in the Web: Issues, Achievements, and Ongoing Standardization Efforts," IEEE Internet Computing, vol. 16, no. 5, pp. 68-73, 2012.
  • [28] A. B. Johnston and D. C. Burnett, WebRTC: APIs and RTCWEB Protocols of the HTML5 Real-Time Web. Digital Codex LLC, 2014.
  • [29] C. Connolly, "Cumulative generation of octree models from range data," vol. 1, pp. 25-32, 1984.
  • [30] L. Kobbelt and M. Botsch, "A survey of point-based techniques in computer graphics," Computers & Graphics, vol. 28, no. 6, pp. 801-814, 2004.
  • [31] H. Tsaknakis and P. Papantoni-Kazakos, "Outlier resistant filtering and smoothing," Information and Computation, vol. 79, no. 2, pp. 163-192, 1988.
  • [32] Y. Wang and H.-Y. Feng, "Outlier detection for scanned point clouds using majority voting," Computer-Aided Design, vol. 62, pp. 31-43, 2015.
  • [33] T. Kaur and R. K. Sidhu, "Performance Evaluation of Fuzzy and Histogram Based Color Image Enhancement," Procedia Computer Science, vol. 58, pp. 470-477, 2015.
  • [34] V. Rajinikanth and M. S. Couceiro, "RGB Histogram Based Color Image Segmentation Using Firefly Algorithm," Procedia Computer Science, vol. 46, pp. 1449-1457, 2015.
  • [35] J. Navarrete, D. Viejo, and M. Cazorla, "Compression and registration of 3D point clouds using GMMs," Pattern Recognition Letters, vol. 110, pp. 8-15, 2018.
  • [36] V. Morell, S. Orts, M. Cazorla, and J. Garcia-Rodriguez, "Geometric 3D point cloud compression," Pattern Recognition Letters, vol. 50, pp. 55-62, 2014.
  • [37] H. Meyer, "Privacy better than ‘Pretty Good’," Computers & Security, vol. 16, no. 7, p. 620, 1997.
  • [38] B. Zajac, "Pretty good privacy," Computer Fraud & Security Bulletin, vol. 1994, no. 9, pp. 14-17, 1994.
  • [39] M. Becke, E. P. Rathgeb, S. Werner, I. Rungeler, M. Tuxen, and R. Stewart, "Data channel considerations for RTCWeb," IEEE Communications Magazine, vol. 51, no. 4, pp. 34-41, 2013.

Details

Primary Language Turkish
Subjects Engineering
Section Articles
Authors

Erdal Erdal 0000-0003-1174-1974

Atilla Ergüzen 0000-0003-4562-2578

Publication Date 28 September 2019
Submission Date 6 August 2018
Published in Issue Year 2019, Volume: 7 Issue: 3

Cite

IEEE E. Erdal and A. Ergüzen, "Kinect Uygulamaları için Veri Transfer Platformu Tasarımı", APJES, vol. 7, no. 3, pp. 473-480, 2019, doi: 10.21541/apjes.451125.