Research Article

A Smart Multimodal Augmented Reality Application Skill Training For Preoperative Procedures

Year 2020, Volume 13, Issue 1, 57-63, 31.01.2020
https://doi.org/10.17671/gazibtd.642130

Abstract

Although essential rules exist for setting up surgical instruments on the mayo table, there is no standard implementation for this stage of preoperative preparation. Nursing students have little opportunity to gain experience because of the high-risk conditions, and they face a lack of feedback during their learning. This study introduces a smart multimodal augmented reality (AR) application developed to address these issues. Ten correct and ten incorrect instrument arrangements prepared by expert nurses were used to train the system. While using the application, learners perform preoperative operating-room preparation procedures and then receive individual feedback on the suitability of their arrangements. Meta2 AR goggles were used to provide gesture interaction. This study presents the interface, features, and smart system structure of the multimodal AR application.
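The abstract does not detail how the twenty expert exemplars are used to evaluate a trainee's layout. Purely as an illustrative sketch under assumed representations (not the authors' implementation), a mayo-table arrangement could be encoded as a fixed-order vector of instrument positions and scored with a k-nearest-neighbour vote over the labelled exemplars; every name and data structure below is hypothetical.

```python
# Illustrative sketch only: the paper's smart system is not reproduced here.
# An arrangement is modelled as instrument -> (x, y) position on the mayo
# table; a trainee's layout is scored against the 10 correct / 10 incorrect
# expert exemplars with a simple k-nearest-neighbour vote.
from dataclasses import dataclass
from math import dist
from typing import Dict, List, Tuple


@dataclass
class Arrangement:
    positions: Dict[str, Tuple[float, float]]  # instrument name -> normalized (x, y)
    correct: bool = False                      # label for expert exemplars


def _vector(arr: Arrangement, instruments: List[str]) -> List[float]:
    """Flatten an arrangement into a fixed-order feature vector."""
    vec: List[float] = []
    for name in instruments:
        x, y = arr.positions.get(name, (0.0, 0.0))  # missing instrument -> origin
        vec.extend((x, y))
    return vec


def score_arrangement(trainee: Arrangement,
                      exemplars: List[Arrangement],
                      k: int = 3) -> float:
    """Return the fraction of the k nearest expert exemplars labelled correct."""
    instruments = sorted({n for e in exemplars for n in e.positions} | set(trainee.positions))
    t = _vector(trainee, instruments)
    ranked = sorted(exemplars, key=lambda e: dist(t, _vector(e, instruments)))
    nearest = ranked[:k]
    return sum(e.correct for e in nearest) / len(nearest)


if __name__ == "__main__":
    # Toy data with two instruments; the real application uses full surgical sets.
    good = Arrangement({"scalpel": (0.2, 0.8), "forceps": (0.4, 0.8)}, correct=True)
    bad = Arrangement({"scalpel": (0.8, 0.1), "forceps": (0.1, 0.9)}, correct=False)
    trainee = Arrangement({"scalpel": (0.25, 0.75), "forceps": (0.45, 0.8)})
    print(f"suitability score: {score_arrangement(trainee, [good, bad], k=1):.2f}")
```

A score above a chosen threshold would map to "suitable" feedback, while lower scores could direct the learner to the instruments placed farthest from the correct exemplars; the published system may use a different representation or classifier.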

Supporting Institution

Marmara University

Project Number

FEN-B-131216-0542; EGT-C-DRP-200716-0392

Details

Primary Language: English
Subjects: Computer Software
Section: Articles
Authors

Zeynep Taçgın (ORCID: 0000-0001-7827-8238)

Ertugrul Tacgın

Project Number: FEN-B-131216-0542; EGT-C-DRP-200716-0392
Publication Date: 31 January 2020
Submission Date: 5 November 2019
Published in Issue: Year 2020, Volume 13, Issue 1

How to Cite

APA Taçgın, Z., & Tacgın, E. (2020). A Smart Multimodal Augmented Reality Application Skill Training For Preoperative Procedures. Bilişim Teknolojileri Dergisi, 13(1), 57-63. https://doi.org/10.17671/gazibtd.642130