NEW TURKISH VIRTUAL KEYBOARD APPLICATION USING SIMPLE CAMERA-MOUNTED EYEGLASSES

Year 2025, Volume: 13 Issue: 3, 910 - 926, 01.09.2025
https://doi.org/10.36306/konjes.1619363

Abstract

In recent years, many studies have sought to enable people with ALS and similar neuromuscular diseases to communicate by typing with their eyes. The methods in the literature each have drawbacks: electrooculography-based methods require electrodes in contact with the user, infrared-camera-based methods are costly commercial systems, and webcam-based methods offer low performance. This study introduces, and evaluates the performance of, a Turkish virtual keyboard application for eye typing built on an eye-tracking system that uses a mini camera mounted on an eyeglass frame. Experiments were conducted with 20 healthy volunteers in two sessions on different days, measuring the system's typing speed, accuracy, and user satisfaction. In the first session, users typed the given pangram at an average of 6.19 words per minute with 94.1% accuracy; in the second session, they reached an average of 6.74 words per minute with 95.5% accuracy. A word-completion feature added to the system increased typing speed further. The results indicate that, in terms of price and performance, the developed system can be an alternative to commercial systems.
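The speed and accuracy figures above are presumably computed with the standard text-entry metrics of the cited literature (MacKenzie and Soukoreff; Arif and Stuerzlinger): words per minute counts five characters as one word, and accuracy is one minus the minimum-string-distance (MSD) error rate. The Python sketch below illustrates both measures; the pangram and the 83-second trial time in the usage example are illustrative assumptions, not values from the study.

    def wpm(transcribed: str, seconds: float) -> float:
        # Text-entry speed: (|T| - 1) characters over the trial time,
        # with 5 characters counted as one word.
        return (len(transcribed) - 1) / seconds * 60.0 / 5.0

    def msd(a: str, b: str) -> int:
        # Minimum string distance (Levenshtein) via dynamic programming.
        prev = list(range(len(b) + 1))
        for i, ca in enumerate(a, 1):
            cur = [i]
            for j, cb in enumerate(b, 1):
                cur.append(min(prev[j] + 1,                 # deletion
                               cur[j - 1] + 1,              # insertion
                               prev[j - 1] + (ca != cb)))   # substitution
            prev = cur
        return prev[-1]

    def accuracy(presented: str, transcribed: str) -> float:
        # Accuracy as a percentage: 1 - MSD error rate.
        return 100.0 * (1.0 - msd(presented, transcribed)
                        / max(len(presented), len(transcribed)))

    # Illustrative usage with a well-known Turkish pangram (not necessarily
    # the phrase used in the study) and a hypothetical 83 s trial:
    target = "pijamalı hasta yağız şoföre çabucak güvendi"
    typed = "pijamalı hasta yağız şofore çabucak güvendi"
    print(f"{wpm(typed, 83):.2f} WPM, {accuracy(target, typed):.1f}% accuracy")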

Ethical Statement

The authors declare that all ethical guidelines, including those on authorship, citation, data reporting, and the publication of original research, have been followed. Permission to conduct the human experiments was obtained from the Başkent University Clinical Research Ethics Committee.

Supporting Institution

The authors declare that this study received no financial support.

References

  • P. Majaranta and A. Bulling, “Eye Tracking and Eye-Based Human–Computer Interaction,” in Advances in Physiological Computing, S. H. Fairclough and K. Gilleade, Eds. London: Springer London, 2014, pp. 39–65, doi: 10.1007/978-1-4471-6392-3_3.
  • E. Skodras and N. Fakotakis, “Precise localization of eye centers in low resolution color images,” Image Vis Comput, vol. 36, pp. 51–60, 2015, doi: 10.1016/j.imavis.2015.01.006.
  • A. Al-Rahayfeh and M. Faezipour, “Eye tracking and head movement detection: A state-of-art survey,” IEEE J Transl Eng Health Med, vol. 1, pp. 11–22, 2013, doi: 10.1109/JTEHM.2013.2289879.
  • D. W. Hansen and Q. Ji, “In the Eye of the Beholder: A Survey of Models for Eyes and Gaze,” IEEE Trans Pattern Anal Mach Intell, vol. 32, no. 3, pp. 478–500, 2010, doi: 10.1109/TPAMI.2009.30.
  • M. Porta, P. Dondi, A. Pianetta, and V. Cantoni, “SPEye: A Calibration-Free Gaze-Driven Text Entry Technique Based on Smooth Pursuit,” IEEE Trans Hum Mach Syst, vol. 52, no. 2, pp. 312–323, 2022, doi: 10.1109/THMS.2021.3123202.
  • M. Mifsud, T. A. Camilleri, and K. P. Camilleri, “HMM-based gesture recognition for eye-swipe typing,” Biomed Signal Process Control, vol. 86, p. 105161, 2023, doi: 10.1016/j.bspc.2023.105161.
  • M. Yildiz and M. Yorulmaz, “A Novel Gaze Input System Based on Iris Tracking with Webcam Mounted Eyeglasses,” Interact Comput, vol. 33, no. 2, pp. 211–222, 2021, doi: 10.1093/iwc/iwab022.
  • A. López, J. R. Villar, M. Fernández, and F. J. Ferrero, “Comparison of classification techniques for the control of EOG-based HCIs,” Biomed Signal Process Control, vol. 80, 2023, doi: 10.1016/j.bspc.2022.104263.
  • M. Yildiz and H. Ö. Ülkütaş, “A New PC-Based Text Entry System Based on EOG Coding,” Advances in Human-Computer Interaction, vol. 2018, 2018, doi: 10.1155/2018/8528176.
  • N. Barbara, T. A. Camilleri, and K. P. Camilleri, “EOG-based eye movement detection and gaze estimation for an asynchronous virtual keyboard,” Biomed Signal Process Control, vol. 47, pp. 159–167, 2019, doi: 10.1016/j.bspc.2018.07.005.
  • S. M. Hosni, H. A. Shedeed, M. S. Mabrouk, and M. F. Tolba, “EEG-EOG based Virtual Keyboard: Toward Hybrid Brain Computer Interface,” Neuroinformatics, vol. 17, no. 3, pp. 323–341, 2019, doi: 10.1007/s12021-018-9402-0.
  • “LabVanced: Features.” Accessed: Jul. 09, 2021. [Online]. Available: https://www.labvanced.com/features.html
  • E. Skodras, V. G. Kanas, and N. Fakotakis, “On visual gaze tracking based on a single low cost camera,” Signal Process Image Commun, vol. 36, pp. 29–42, Aug. 2015, doi: 10.1016/j.image.2015.05.007.
  • A. B. Usakli and S. Gurkan, “Design of a novel efficient human–computer interface: An electrooculogram based virtual keyboard,” IEEE Trans Instrum Meas, vol. 59, no. 8, pp. 2099–2108, 2010, doi: 10.1109/TIM.2009.2030923.
  • D. S. Nathan, A. P. Vinod, and K. P. Thomas, “An electrooculogram based assistive communication system with improved speed and accuracy using multi-directional eye movements,” 2012 35th International Conference on Telecommunications and Signal Processing, TSP 2012 - Proceedings, pp. 554–558, 2012, doi: 10.1109/TSP.2012.6256356.
  • S. He and Y. Li, “A single-channel EOG-based speller,” IEEE Transactions on Neural Systems and Rehabilitation Engineering, vol. 25, no. 11, pp. 1978–1987, 2017, doi: 10.1109/TNSRE.2017.2716109.
  • S. D and R. R. M, “A high performance asynchronous EOG speller system,” Biomed Signal Process Control, vol. 59, p. 101898, 2020, doi: 10.1016/j.bspc.2020.101898.
  • A. Huckauf and M. H. Urbina, “Gazing with pEYEs: Towards a universal input for various applications,” in Eye Tracking Research and Applications Symposium (ETRA), 2008, pp. 51–54. doi: 10.1145/1344471.1344483.
  • P. Majaranta, Text Entry by Eye Gaze, Academic Dissertation, Dissertations in Interactive Technology no. 11, University of Tampere, 2009.
  • P. O. Kristensson and K. Vertanen, “The potential of dwell-free eye-typing for fast assistive gaze communication,” in Proceedings of the Symposium on Eye Tracking Research and Applications - ETRA ’12, 2012, p. 241. doi: 10.1145/2168556.2168605.
  • O. Tuisku, V. Surakka, V. Rantanen, T. Vanhala, and J. Lekkala, “Text entry by gazing and smiling,” Advances in Human-Computer Interaction, vol. 2013, 2013, doi: 10.1155/2013/218084.
  • D. Pedrosa, M. D. G. Pimentel, A. Wright, and K. N. Truong, “Filteryedping: Design challenges and user performance of dwell-free eye typing,” ACM Transactions on Accessible Computing (TACCESS), vol. 6, no. 1, pp. 1–37, 2015.
  • O. H. M. Lutz, A. C. Venjakob, and S. Ruff, “SMOOVS: Towards calibration-free text entry by gaze using smooth pursuit movements,” J Eye Mov Res, vol. 8, no. 1, pp. 1–11, 2015, doi: 10.16910/jemr.8.1.2.
  • A. Kurauchi, W. Feng, A. Joshi, C. Morimoto, and M. Betke, “EyeSwipe: Dwell-free Text Entry Using Gaze Paths,” in Proceedings of the 2016 CHI Conference on Human Factors in Computing Systems (CHI ’16), 2016, pp. 1952–1956.
  • Z. Zeng, X. Wang, F. W. Siebert, and H. Liu, “Enhancing Hybrid Eye Typing Interfaces with Word and Letter Prediction: A Comprehensive Evaluation,” Int J Hum Comput Interact, 2025, doi: 10.1080/10447318.2023.2297113.
  • P. Dondi, S. Sapuppo, and M. Porta, “Leyenes: A gaze-based text entry method using linear smooth pursuit and target speed,” Int J Hum Comput Stud, vol. 184, p. 103204, Apr. 2024, doi: 10.1016/j.ijhcs.2023.103204.
  • M. Barrett, H. Skovsgaard, and J. S. Agustin, “Performance evaluation of a low-cost gaze tracker for eye typing,” Proceedings of the 5th Conference on Communication by Gaze Interaction, pp. 13–17, 2009.
  • C. Zhang, R. Yao, and J. Cai, “Efficient eye typing with 9-direction gaze estimation,” Multimed Tools Appl, vol. 77, no. 15, pp. 19679–19696, 2018, doi: 10.1007/s11042-017-5426-y.
  • R. Karatay, B. Demir, A. A. Ergin, and E. Erkan, “A real-time eye movement-based computer interface for people with disabilities,” Smart Health, vol. 34, p. 100521, Dec. 2024, doi: 10.1016/j.smhl.2024.100521.
  • V. Emile Tatinyuy, A. V. Noumsi Woguia, J. M. Ngono, and L. A. Fono, “Multi-stage gaze-controlled virtual keyboard using eye tracking,” PLoS One, vol. 19, no. 10, p. e0309832, 2024, doi: 10.1371/journal.pone.0309832.
  • G. R. Chhimpa, A. Kumar, S. Garhwal, and Dhiraj, “Empowering individuals with disabilities: a real-time, cost-effective, calibration-free assistive system utilizing eye tracking,” J Real Time Image Process, vol. 21, no. 3, pp. 1–16, May 2024, doi: 10.1007/s11554-024-01478-w.
  • Y. Liu, B. L. Deepu, R. Andrzej, and S. Martin, “CamType: assistive text entry using gaze with an off-the-shelf webcam,” Mach Vis Appl, vol. 30, no. 3, pp. 407–421, 2019, doi: 10.1007/s00138-018-00997-4.
  • Y. Durna and F. Arı, “Development and Application of Gaze Point Detection with Polynomial Functions,” Savunma Bilimleri Dergisi, vol. 15, no. 2, pp. 25–45, 2016.
  • K. Lee et al., “Eye-wearable head-mounted tracking and gaze estimation interactive machine system for human–machine interface,” 2021, doi: 10.1177/1461348419875047.
  • D. Li, J. Babcock, and D. J. Parkhurst, “openEyes: a low-cost head-mounted eye-tracking solution,” in ETRA 2006, 2006, pp. 27–29.
  • W. J. Ryan, A. T. Duchowski, and S. T. Birchfield, “Limbus/pupil switching for wearable eye tracking under variable lighting conditions,” in Proceedings of the 2008 symposium on Eye tracking research & applications - ETRA ’08, 2008, p. 61. doi: 10.1145/1344471.1344487.
  • Y. Zhang, A. Bulling, and H. Gellersen, “Discrimination of gaze directions using low-level eye image features,” in Proceedings of the 1st international workshop on Pervasive eye tracking & mobile eye-based interaction - PETMEI ’11, 2011, p. 9. doi: 10.1145/2029956.2029961.
  • B. R. Pires, M. Hwangbo, M. Devyver, and T. Kanade, “Visible-spectrum gaze tracking for sports,” IEEE Computer Society Conference on Computer Vision and Pattern Recognition Workshops, pp. 1005–1010, 2013, doi: 10.1109/CVPRW.2013.146.
  • O. Mazhar, T. A. Shah, M. A. Khan, and S. Tehami, “A real-time webcam based Eye Ball Tracking System using MATLAB,” 2015 IEEE 21st International Symposium for Design and Technology in Electronic Packaging, SIITME 2015, pp. 139–142, 2015, doi: 10.1109/SIITME.2015.7342312.
  • E. Sümer, İ. B. Uslu, and M. Türker, “An Eye-Controlled Wearable Communication and Control System for ALS Patients: SmartEyes,” vol. 8, no. 2, pp. 107–116, 2017.
  • Z. Zhu and Q. Ji, “Novel eye gaze tracking techniques under natural head movement,” IEEE Transactions on Biomedical Engineering, vol. 54, no. 12, pp. 2246–2260, 2007, doi: 10.1109/TBME.2007.895750.
  • M. E. Dalkılıç and G. Dalkılıç, “On the Cryptographic Patterns and Frequencies in Turkish Language,” in Lecture Notes in Computer Science, 2002, doi: 10.1007/3-540-36077-8.
  • P. Majaranta, U.-K. Ahola, and O. Špakov, “Fast gaze typing with an adjustable dwell time,” in Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (CHI ’09), 2009, pp. 357–360, doi: 10.1145/1518701.1518758.
  • I. S. MacKenzie and R. W. Soukoreff, “Phrase sets for evaluating text entry techniques,” in CHI ’03 Extended Abstracts on Human Factors in Computing Systems, 2003, p. 754, doi: 10.1145/765968.765971.
  • R. Rutter, “List of Pangrams,” Clagnut. Accessed: Jan. 01, 2024. [Online]. Available: https://clagnut.com/blog/2380
  • I. S. MacKenzie and K. Tanaka-Ishii, Text Entry Systems: Mobility, Accessibility, Universality. Morgan Kaufmann, 2007.
  • A. S. Arif and W. Stuerzlinger, “Analysis of text entry performance metrics,” TIC-STH’09: 2009 IEEE Toronto International Conference - Science and Technology for Humanity, pp. 100–105, 2009, doi: 10.1109/TIC-STH.2009.5444533.
  • M. J. Tsai, H. T. Hou, M. L. Lai, W. Y. Liu, and F. Y. Yang, “Visual attention for solving multiple-choice science problem: An eye-tracking analysis,” Comput Educ, vol. 58, no. 1, pp. 375–385, 2012, doi: 10.1016/j.compedu.2011.07.012.
  • B. Benligiray, C. Topal, and C. Akinlar, “SliceType: fast gaze typing with a merging keyboard,” Journal on Multimodal User Interfaces, vol. 13, no. 4, pp. 321–334, 2019, doi: 10.1007/s12193-018-0285-z.
  • Y. K. Meena, H. Cecotti, K. F. Wong-Lin, and G. Prasad, “Design and evaluation of a time adaptive multimodal virtual keyboard,” Journal on Multimodal User Interfaces, vol. 13, no. 4, pp. 343–361, 2019, doi: 10.1007/s12193-019-00293-z.
  • L. A. Farwell and E. Donchin, “Talking off the top of your head: toward a mental prosthesis utilizing event-related brain potentials,” Electroencephalogr Clin Neurophysiol, vol. 70, no. 6, pp. 510–523, Dec. 1988, doi: 10.1016/0013-4694(88)90149-6.
  • X. Chen, Y. Wang, M. Nakanishi, X. Gao, T. P. Jung, and S. Gao, “High-speed spelling with a noninvasive brain-computer interface,” Proc Natl Acad Sci U S A, vol. 112, no. 44, pp. E6058–E6067, Nov. 2015, doi: 10.1073/pnas.1508080112.

Details

Primary Language English
Subjects Biomedical Imaging
Journal Section Research Article
Authors

Seval Uğurlu 0000-0002-8310-5785

Muhammet Yorulmaz 0000-0002-4904-9396

Metin Yıldız 0000-0002-2554-6953

Publication Date September 1, 2025
Submission Date January 13, 2025
Acceptance Date June 30, 2025
Published in Issue Year 2025 Volume: 13 Issue: 3

Cite

IEEE S. Uğurlu, M. Yorulmaz, and M. Yıldız, “NEW TURKISH VIRTUAL KEYBOARD APPLICATION USING SIMPLE CAMERA-MOUNTED EYEGLASSES”, KONJES, vol. 13, no. 3, pp. 910–926, 2025, doi: 10.36306/konjes.1619363.