Research Article

Sensing Glove Study for Augmented Reality Environments

Year 2017, Volume: 6 Issue: 2, 44 - 49, 20.07.2017

Abstract

Computers, mobile devices, and newly emerging wearable technologies motivate researchers to create new environments. Smart glasses are among these developing wearable technologies. Studies have examined the usability and functionality of this technology, which is used in augmented reality environments, in education as well as in many other fields. Some of these studies focus on users' control over digital objects. In this context, the present study attempted to establish object control in augmented reality environments through various sensors, such as RGB-D, heat, or movement sensors. The aim of the current study was to develop a sensing glove, a wearable technology for augmented reality environments. Unity and the Vuforia SDK were integrated with the Epson Moverio BT-200 and various sensors, such as accelerometer, flex, and force sensors. A sensing glove was thus prepared for the user in the augmented reality environment. The new hardware and software provide developers with an environment of alternative, new, and up-to-date applications that can be used in augmented and virtual environments.
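The abstract does not detail how the glove's sensor readings are processed, but a glove of this kind typically reads its flex sensors through a voltage divider and converts the raw ADC values to bend angles before forwarding them to the rendering software. The sketch below illustrates that conversion step only; all constants (supply voltage, resistor values, calibration points, ADC resolution) are illustrative assumptions, not values from the paper.

```python
# Hypothetical sketch: mapping a raw flex-sensor ADC reading to an
# approximate finger bend angle, as a sensing glove might do before
# sending values to a Unity application. Constants are illustrative.

VCC = 5.0          # assumed supply voltage (V)
R_DIV = 47_000.0   # assumed fixed resistor in the voltage divider (ohms)
ADC_MAX = 1023     # assumed 10-bit ADC, common on microcontrollers

# Illustrative calibration: flex-sensor resistance when flat vs. fully bent.
R_FLAT = 25_000.0
R_BENT = 100_000.0

def adc_to_angle(raw: int) -> float:
    """Map a raw ADC reading to an approximate bend angle in degrees."""
    v_out = raw * VCC / ADC_MAX            # divider output voltage
    r_flex = R_DIV * (VCC / v_out - 1.0)   # solve the divider for sensor R
    # Linear interpolation between the flat and fully-bent calibration points.
    t = (r_flex - R_FLAT) / (R_BENT - R_FLAT)
    return max(0.0, min(90.0, t * 90.0))   # clamp to the 0-90 degree range
```

In practice such a function would run per finger, with per-sensor calibration values recorded while the user holds the hand flat and fully closed.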

References

  • Bäckman, M., & Tenfält, A. (2015). Exploring the Opportunities of Wearable Technology within Siemens AG (Master of Science thesis). Department of Technology Management and Economics, Division of Innovation Engineering and Management, Chalmers University of Technology, Göteborg, Sweden.
  • Li, K., Chen, I.-M., Yeo, S. H., & Lim, C. K. (2011). Development of finger-motion capturing device based on optical linear encoder. Journal of Rehabilitation Research and Development, 48(1), 69.
  • Mitobe, K., Kaiga, T., Yukawa, T., Miura, T., Tamamoto, H., Rodgers, A., & Yoshimura, N. (2006, July). Development of a motion capture system for a hand using a magnetic three dimensional position sensor. In SIGGRAPH Research Posters (p. 102).
  • Nishiyama, M., & Watanabe, K. (2009). Wearable sensing glove with embedded hetero-core fiber-optic nerves for unconstrained hand motion capture. IEEE Transactions on Instrumentation and Measurement, 58(12), 3995-4000.
  • Mizuchi, Y., Kawai, H., Hagiwara, Y., Imamura, H., & Choi, Y. (2010, October). Estimation for a hand position and orientation robust to motion of fingers using a web camera. In Control Automation and Systems (ICCAS), 2010 International Conference on (pp. 187-190). IEEE.
  • Metcalf, C. D., Robinson, R., Malpass, A. J., Bogle, T. P., Dell, T. A., Harris, C., & Demain, S. H. (2013). Markerless motion capture and measurement of hand kinematics: validation and application to home-based upper limb rehabilitation. IEEE Transactions on Biomedical Engineering, 60(8), 2184-2192.
  • Shim, J., Yang, Y., Kang, N., Seo, J., & Han, T. D. (2016). Gesture-based interactive augmented reality content authoring system using HMD. Virtual Reality, 20(1), 57-69.
  • Shen, Y., Ong, S. K., & Nee, A. Y. C. (2011). Vision-based hand interaction in augmented reality environment. International Journal of Human–Computer Interaction, 27(6), 523-544.
  • Rauschnabel, P. A., Brem, A., & Ivens, B. S. (2015). Who will buy smart glasses? Empirical results of two pre-market-entry studies on the role of personality in individual awareness and intended adoption of Google Glass wearables. Computers in Human Behavior, 49, 635-647.
  • Mistry, P., & Maes, P. (2009, December). SixthSense: A wearable gestural interface. In ACM SIGGRAPH ASIA 2009 Sketches (p. 11). ACM.
  • Mann, S. (1997). Wearable computing: A first step toward personal imaging. Computer, 30(2), 25-32.
  • Kim, M., & Lee, J. Y. (2016). Touch and hand gesture-based interactions for directly manipulating 3D virtual objects in mobile augmented reality. Multimedia Tools and Applications, 75(23), 16529-16550.
  • Jimeno-Morenilla, A., Sánchez-Romero, J. L., & Salas-Pérez, F. (2013). Augmented and Virtual Reality techniques for footwear. Computers in Industry, 64(9), 1371-1382.
  • Zhao, M. Y., Ong, S. K., & Nee, A. Y. C. (2016). An Augmented Reality-Assisted Therapeutic Healthcare Exercise System Based on Bare-Hand Interaction. International Journal of Human-Computer Interaction, 32(9), 708-721.
  • Seo, D. W., & Lee, J. Y. (2013). Direct hand touchable interactions in augmented reality environments for natural and intuitive user experiences. Expert Systems with Applications, 40(9), 3784-3793.
  • Ben-Tzvi, P., & Ma, Z. (2015). Sensing and Force-Feedback Exoskeleton (SAFE) Robotic Glove. IEEE Transactions on Neural Systems and Rehabilitation Engineering, 23(6), 992-1002.
  • Nee, A. Y. C., Ong, S. K., Chryssolouris, G., & Mourtzis, D. (2012). Augmented reality applications in design and manufacturing. CIRP Annals-manufacturing technology, 61(2), 657-679.
  • Brancati, N., Caggianese, G., Frucci, M., Gallo, L., & Neroni, P. (2015, June). Robust fingertip detection in egocentric vision under varying illumination conditions. In Multimedia & Expo Workshops (ICMEW), 2015 IEEE International Conference on (pp. 1-6). IEEE.
  • Nogueira, Y. L., Freitas, M. O., Vidal, C. A., & Cavalcante-Neto, J. B. (2010). Arpong: A viability case study of markerless ar-based interface to computer games with webcam. IX SBGames-Florianópolis-SC., p.236-239.
  • Hong, J., & Tan, X. (1989, May). Calibrating a VPL DataGlove for teleoperating the Utah/MIT hand. In Robotics and Automation, 1989. Proceedings., 1989 IEEE International Conference (pp. 1752-1757). IEEE.
  • Kadous, M. W. (1996, October). Machine recognition of Auslan signs using PowerGloves: Towards large-lexicon recognition of sign language. In Proceedings of the Workshop on the Integration of Gesture in Language and Speech (pp. 165-174).
  • Lee, T., & Hollerer, T. (2007, October). Handy AR: Markerless inspection of augmented reality objects using fingertip tracking. In Wearable Computers, 2007 11th IEEE International Symposium (pp. 83-90). IEEE.
  • Wang, Z., Shen, Y., Ong, S. K., & Nee, A. Y. C. (2009, September). Assembly design and evaluation based on bare-hand interaction in an augmented reality environment. In CyberWorlds, 2009. CW'09. International Conference (pp. 21-28). IEEE.
  • Asai, S., & Hirakawa, M. (2009, December). Dual face interaction in handheld augmented reality environments. In Multimedia, 2009. ISM'09. 11th IEEE International Symposium (pp. 187-193). IEEE.
  • Huang, F., Zhou, Y., Yu, Y., Wang, Z., & Du, S. (2011, August). Piano ar: A markerless augmented reality based piano teaching system. In Intelligent Human-Machine Systems and Cybernetics (IHMSC), 2011 International Conference (Vol. 2, pp. 47-52). IEEE.
  • Yang, M. T., Liao, W. C., & Shih, Y. C. (2013, July). VECAR: Virtual English classroom with markerless augmented reality and intuitive gesture interaction. In Advanced Learning Technologies (ICALT), 2013 IEEE 13th International Conference (pp. 439-440). IEEE.
  • Ha, T., Feiner, S., & Woo, W. (2014, September). WeARHand: Head-worn, RGB-D camera-based, bare-hand user interface with visually enhanced depth perception. In Mixed and Augmented Reality (ISMAR), 2014 IEEE International Symposium (pp. 219-228). IEEE.
  • Lv, Z., Feng, S., Feng, L., & Li, H. (2015, March). Extending touch-less interaction on vision based wearable device. In 2015 IEEE Virtual Reality (VR)(pp. 231-232). IEEE.
  • Elder, S., & Vakaloudis, A. (2015, September). A technical evaluation of devices for smart glasses applications. In Internet Technologies and Applications (ITA), 2015 (pp. 98-103). IEEE.
  • DIGI-CAPITAL (2016, July 15). After mixed year, mobile AR to drive $108 billion VR/AR market by 2021. Retrieved from http://www.digi-capital.com/news/2015/04/augmentedvirtual-reality-to-hit-150-billiondisrupting-mobile-by-2020/#.V4kmX-uLS00
There are 28 citations in total.

Details

Journal Section FIFTEENTH ISSUE
Authors

Mustafa Serkan Abdüsselam

Publication Date July 20, 2017
Published in Issue Year 2017 Volume: 6 Issue: 2

Cite

APA Abdüsselam, M. S. (2017). Sensing Glove Study for Augmented Reality Environments. Journal of Instructional Technologies and Teacher Education, 6(2), 44-49.