Research Article

Game Based Rehabilitation Using Motion Capture and Virtual Reality Technologies (Hareket Yakalama ve Sanal Gerçeklik Teknolojileri Kullanarak Oyun Tabanlı Rehabilitasyon)

Year 2021, Volume: 9 Issue: 6 - ICAIAME 2021, 269 - 279, 31.12.2021
https://doi.org/10.29130/dubited.1015397

Abstract

Diseases arising from damage to the human muscular and nervous systems have serious effects on quality of life. One of the most significant of these is hemiplegia. Hemiplegia, also known as partial paralysis, is a nervous system disorder that affects the left or right side of the body. Because of damage to the brain, affected individuals are unable to move or have difficulty moving. The treatment and rehabilitation phase of this disease is extremely important: the disease should be diagnosed early and rehabilitation started immediately, the primary aim of treatment being recovery without harm to other body functions. In this study, a game-based virtual reality application was developed that detects the movements of patients during the rehabilitation process. A total of 23 sensors were placed on the patient: 10 flex sensors on the fingers and 13 MPU9250 inertial measurement units on the joints. The data received from the sensors were first calibrated; the movements of hemiplegia patients were then detected from the real-time data streamed from the calibrated sensors to the virtual reality headset. A game mode was designed to match the exercises prescribed to hemiplegia patients by expert physiotherapists, so that a patient wearing the virtual reality headset plays a game built around the movements the physiotherapists have specified. The study is expected to contribute significantly to the recovery process of hemiplegia patients.
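As a rough illustration of the sensing pipeline the abstract describes (this is not the authors' code), the sketch below maps a raw flex-sensor reading to a finger-bend angle using a simple two-point calibration, and fuses gyroscope and accelerometer readings of the kind an MPU9250-class IMU produces with a complementary filter. The function names, the 0–90° range, and the 0.98 filter coefficient are all assumptions made for illustration.

```python
def calibrate_flex(raw, raw_straight, raw_bent, angle_max=90.0):
    """Linearly map a raw ADC reading to a bend angle in degrees.

    raw_straight / raw_bent are the readings captured during a
    per-patient calibration step with the finger fully straight
    and fully bent; the result is clamped to [0, angle_max].
    """
    span = raw_bent - raw_straight
    if span == 0:
        raise ValueError("calibration endpoints must differ")
    t = (raw - raw_straight) / span
    return max(0.0, min(angle_max, t * angle_max))


def complementary_filter(angle_prev, gyro_rate, accel_angle, dt, alpha=0.98):
    """Fuse IMU data into a joint-angle estimate.

    Short-term motion comes from integrating the gyro rate (deg/s);
    long-term drift is corrected by the accelerometer tilt angle.
    """
    return alpha * (angle_prev + gyro_rate * dt) + (1 - alpha) * accel_angle
```

With calibration endpoints of 300 (straight) and 800 (bent), a raw reading of 550 maps to a 45° bend; readings outside the calibrated range are clamped rather than extrapolated, which is the usual choice when the angle drives an on-screen avatar.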

Supporting Institution

TÜBİTAK

Project Number

3200662

Thanks

This study is supported within the scope of a TÜBİTAK TEYDEB 1501 project (Project No: 3200662).

References

  • [1] B. Gündüz, “İnme ve prognozu etkileyen faktörler,” Türk Fizyoterapi ve Rehabilitasyon Dergisi, ss. 52, 2006.
  • [2] İ. Mirzayev, “İnme hastalarında sanal gerçeklik eğitiminin üst ekstremite fonksiyonlarına etkisinin araştırılması,” Uzmanlık tezi, Fiziksel Tıp ve Rehabilitasyon Anabilim Dalı, Başkent Üniversitesi, Ankara, Türkiye, 2015.
  • [3] U. Fidan ve Ö. Neşe, “Nörolojik rehabilitasyon için kinect sensörlü ölçüm ve egzersiz sisteminin tasarımı ve gerçekleştirilmesi,” Afyon Kocatepe Üniversitesi Fen ve Mühendislik Bilim. Dergisi, c. 18, s. 2, ss. 727–733, 2018.
  • [4] E. Bayraktar, “Sanal gerçeklik ve uygulama alanları,” Akademik Bilişim, Kütahya, Türkiye, 2007, ss. 1–6.
  • [5] D. Thewlis, C. Bishop, N. Daniell, and G. Paul, “Next-generation low-cost motion capture systems can provide comparable spatial accuracy to high-end systems,” Journal of Applied Biomechanics, vol. 29, no. 1, pp. 112–117, 2013.
  • [6] S. Chanpimol, B. Seamon, H. Hernandez, M. Harris-Love, and M. R. Blackman, “Using Xbox kinect motion capture technology to improve clinical rehabilitation outcomes for balance and cardiovascular health in an individual with chronic TBI,” Archives of Physiotherapy, vol. 7, no. 1, pp. 1–11, May 2017.
  • [7] J. N. Carlson, S. Das, F. D. la Torre, C. W. Callaway, P. E. Phrampus, and J. Hodgins, “Motion capture measures variability in laryngoscopic movement during endotracheal intubation: A preliminary report,” Simulation in Healthcare, vol. 7, no. 4, p. 255, Aug. 2012.
  • [8] D. C. Wierschem, J. A. Jimenez, and F. A. Méndez Mediavilla, “A motion capture system for the study of human manufacturing repetitive motions,” The International Journal of Advanced Manufacturing Technology, vol. 110, no. 3, pp. 813–827, Aug. 2020.
  • [9] M. Jebeli, A. Bilesan, and A. Arshi, “A study on validating KinectV2 in comparison of Vicon system as a motion capture system for using in Health Engineering in industry,” Nonlinear Engineering, vol. 6, no. 2, pp. 95–99, Jun. 2017.
  • [10] S. W. Park, H. S. Park, J. H. Kim, and H. Adeli, “3D displacement measurement model for health monitoring of structures using a motion capture system,” Measurement, vol. 59, pp. 352–362, Jan. 2015.
  • [11] A. Fernández-Baena, A. Susín, and X. Lligadas, “Biomechanical validation of upper-body and lower-body joint movements of kinect motion capture data for rehabilitation treatments,” in 2012 fourth international conference on intelligent networking and collaborative systems, Bucharest, Romania, 2012, pp. 656–661.
  • [12] T. Sakaguchi, T. Kanamori, H. Katayose, K. Sato, and S. Inokuchi, “Human motion capture by integrating gyroscopes and accelerometers,” in International Conference on Multisensor Fusion and Integration for Intelligent Systems, Washington, DC, USA, 1996, pp. 470–475.
  • [13] E. Pruna et al., “3D virtual system trough 3 space mocap sensors for lower limb rehabilitation,” in International Conference on Augmented Reality, Virtual Reality and Computer Graphics, Ugento, Italy, 2017, vol. 10325 LNCS, pp. 119–128.
  • [14] Y. Fujimori, Y. Ohmura, T. Harada, and Y. Kuniyoshi, “Wearable motion capture suit with full-body tactile sensors,” in IEEE International Conference on Robotics and Automation, Kobe, Japan, 2009, pp. 3186–3193.
  • [15] L. Gao, G. Zhang, B. Yu, Z. Qiao, and J. Wang, “Wearable human motion posture capture and medical health monitoring based on wireless sensor networks,” Measurement, vol. 166, Art. no. 108252, Dec. 2020.
  • [16] C. M. N. Brigante, N. Abbate, A. Basile, A. C. Faulisi, and S. Sessa, “Towards miniaturization of a MEMS-based wearable motion capture system,” IEEE Transactions on Industrial Electronics, vol. 58, no. 8, pp. 3234–3241, Aug. 2011.
  • [17] C. Treffers and L. van Wietmarschen, “Position and orientation determination of a probe with use of the IMU MPU9250 and a ATmega328 microcontroller,” Bachelor thesis, Electrical Engineering, Delft University of Technology, Netherlands, 2016.
  • [18] Sparkfun. (2010, June 16). Flex Sensor Data Sheet [Online]. Available: https://cdn-shop.adafruit.com/datasheets/SpectraFlex2inch.pdf.
  • [19] M. Bálský, M. Kozlok, and R. Bayer, “Application of Arduino platform for light field analysis,” in 7th Lighting Conference of the Visegrad Countries, LUMEN V4, Trebic, Czech, Nov. 2018, pp. 1–4.
  • [20] A. Bujnowski et al., “Smart weighing scale with feet-sampled ECG,” in Proceedings: IECON 2018 - 44th Annual Conference of the IEEE Industrial Electronics Society, Washington DC, USA, Dec. 2018, pp. 3286–3291.
  • [21] Karadeniz Teknik Üniversitesi Bilgisayar Mühendisliği Bölümü Mikroişlemciler Laboratuarı. (2017, 6 Ekim). Arm Mimarisi ve Uygulamaları [Çevrimiçi]. Erişim: https://www.ktu.edu.tr/dosyalar/bilgisayar_a7670.pdf.
  • [22] S. Wang, Z. Mao, C. Zeng, H. Gong, S. Li, and B. Chen, “A new method of virtual reality based on Unity3D,” in 18th International Conference on Geoinformatics, Beijing, China, 2010, pp. 1–5.


Details

Primary Language Turkish
Subjects Engineering
Journal Section Articles
Authors

Ferdi Alakus 0000-0002-6096-4659

Ali Hakan Isık 0000-0003-3561-9375

Ömer Can Eskicioğlu 0000-0001-5644-2957

Project Number 3200662
Publication Date December 31, 2021
Published in Issue Year 2021 Volume: 9 Issue: 6 - ICAIAME 2021

Cite

APA Alakus, F., Isık, A. H., & Eskicioğlu, Ö. C. (2021). Hareket Yakalama ve Sanal Gerçeklik Teknolojileri Kullanarak Oyun Tabanlı Rehabilitasyon. Düzce Üniversitesi Bilim Ve Teknoloji Dergisi, 9(6), 269-279. https://doi.org/10.29130/dubited.1015397
AMA Alakus F, Isık AH, Eskicioğlu ÖC. Hareket Yakalama ve Sanal Gerçeklik Teknolojileri Kullanarak Oyun Tabanlı Rehabilitasyon. DUBİTED. December 2021;9(6):269-279. doi:10.29130/dubited.1015397
Chicago Alakus, Ferdi, Ali Hakan Isık, and Ömer Can Eskicioğlu. “Hareket Yakalama Ve Sanal Gerçeklik Teknolojileri Kullanarak Oyun Tabanlı Rehabilitasyon”. Düzce Üniversitesi Bilim Ve Teknoloji Dergisi 9, no. 6 (December 2021): 269-79. https://doi.org/10.29130/dubited.1015397.
EndNote Alakus F, Isık AH, Eskicioğlu ÖC (December 1, 2021) Hareket Yakalama ve Sanal Gerçeklik Teknolojileri Kullanarak Oyun Tabanlı Rehabilitasyon. Düzce Üniversitesi Bilim ve Teknoloji Dergisi 9 6 269–279.
IEEE F. Alakus, A. H. Isık, and Ö. C. Eskicioğlu, “Hareket Yakalama ve Sanal Gerçeklik Teknolojileri Kullanarak Oyun Tabanlı Rehabilitasyon”, DUBİTED, vol. 9, no. 6, pp. 269–279, 2021, doi: 10.29130/dubited.1015397.
ISNAD Alakus, Ferdi et al. “Hareket Yakalama Ve Sanal Gerçeklik Teknolojileri Kullanarak Oyun Tabanlı Rehabilitasyon”. Düzce Üniversitesi Bilim ve Teknoloji Dergisi 9/6 (December 2021), 269-279. https://doi.org/10.29130/dubited.1015397.
JAMA Alakus F, Isık AH, Eskicioğlu ÖC. Hareket Yakalama ve Sanal Gerçeklik Teknolojileri Kullanarak Oyun Tabanlı Rehabilitasyon. DUBİTED. 2021;9:269–279.
MLA Alakus, Ferdi et al. “Hareket Yakalama Ve Sanal Gerçeklik Teknolojileri Kullanarak Oyun Tabanlı Rehabilitasyon”. Düzce Üniversitesi Bilim Ve Teknoloji Dergisi, vol. 9, no. 6, 2021, pp. 269-79, doi:10.29130/dubited.1015397.
Vancouver Alakus F, Isık AH, Eskicioğlu ÖC. Hareket Yakalama ve Sanal Gerçeklik Teknolojileri Kullanarak Oyun Tabanlı Rehabilitasyon. DUBİTED. 2021;9(6):269-79.