Research Article

The Identification of Individualized Eye Tracking Metrics in VR Using Data Driven Iterative-Adaptive Algorithm

Year 2023, Volume: 14, Issue: 52, 8 - 21, 07.03.2023
https://doi.org/10.5824/ajite.2023.01.001.x

Abstract

Eye tracking metrics provide information about cognitive function and basic oculomotor characteristics. Many studies have analyzed eye tracking signals using different algorithms. However, these algorithms are generally based on initial setting parameters, which may lead to subjective interpretation of eye tracking analyses. The main aim of this study was to develop a data-driven algorithm to detect fixations and saccades without any subjective settings. Three subjects were included in this study. Eye tracking signals were acquired with the VIVE Pro Eye in a virtual reality (VR) environment while the subjects read a paragraph. Threshold-based algorithms were employed to calculate eye metrics, including total fixation duration, total fixation count, total saccade count, and average pupil diameter. The proposed algorithm, which derives its initial threshold from the mean and standard deviation of the eye tracking signal over the experiment duration, produced the same results as the adaptive filtering approach reported in the literature (average fixation duration for the three subjects = 11515 ± 6951.2 ms; average fixation count = 17.33 ± 4.16). Unlike adaptive filtering, however, the proposed algorithm did not rely on any preset parameter. In conclusion, the VIVE Pro Eye may be used as an eye movement assessment device, and the suggested approach may be used to compute objective eye tracking metrics.
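
To make the thresholding idea described above concrete, the sketch below illustrates one way a data-driven, iterative velocity threshold could be implemented. It is only an illustration of the general mean-plus-standard-deviation scheme summarized in the abstract, not the authors' actual implementation: the function names, the multiplier k, the convergence tolerance, and the use of an angular-velocity input are assumptions.

    import numpy as np

    def detect_saccades(velocity, k=6.0, tol=1.0, max_iter=100):
        # velocity: 1-D gaze speed signal (e.g., deg/s). Samples above the
        # converged threshold are labeled as saccade samples.
        # k, tol, max_iter are illustrative values, not taken from the paper.
        velocity = np.asarray(velocity, dtype=float)
        # Initial threshold from the whole recording: mean + k * SD.
        threshold = velocity.mean() + k * velocity.std()
        for _ in range(max_iter):
            below = velocity[velocity < threshold]          # candidate fixation samples
            new_threshold = below.mean() + k * below.std()  # re-estimate from fixation-level noise
            if abs(new_threshold - threshold) < tol:        # stop once the threshold settles
                threshold = new_threshold
                break
            threshold = new_threshold
        return velocity >= threshold

    def fixation_metrics(saccade_mask, sample_rate_hz):
        # Derive fixation count and total fixation duration (ms) from the mask.
        fix = ~np.asarray(saccade_mask)
        starts = np.flatnonzero(fix & ~np.r_[False, fix[:-1]])  # fixation onsets
        total_ms = 1000.0 * fix.sum() / sample_rate_hz
        return len(starts), total_ms

In this sketch a fixation is simply any run of consecutive below-threshold samples, so fixation count and total fixation duration follow directly from the mask and the sampling rate of the headset's eye tracker.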

Supporting Institution

This work was supported by Neo Auvra® Digital Health and Bionic Technologies and Services Inc.

References

  • B. Poletti et al., “An eye-tracker controlled cognitive battery: overcoming verbal-motor limitations in ALS,” J Neurol, vol. 264, no. 6, pp. 1136-1145, Jun 2017, doi: 10.1007/s00415-017-8506-z.
  • N. Noiret et al., “Saccadic Eye Movements and Attentional Control in Alzheimer's Disease,” Arch Clin Neuropsychol, vol. 33, no. 1, pp. 1-13, Feb 1 2018, doi: 10.1093/arclin/acx044.
  • I. M. Pavisic et al., “Eyetracking Metrics in Young Onset Alzheimer's Disease: A Window into Cognitive Visual Functions,” Frontiers in neurology, vol. 8, pp. 377-377, 2017, doi: 10.3389/fneur.2017.00377.
  • C. de Boer, J. van der Steen, F. Mattace-Raso, A. J. Boon, and J. J. Pel, “The Effect of Neurodegeneration on Visuomotor Behavior in Alzheimer's Disease and Parkinson's Disease,” Motor Control, vol. 20, no. 1, pp. 1-20, Jan 2016.
  • J. Fielding, T. Kilpatrick, L. Millist, and O. White, “Multiple sclerosis: Cognition and saccadic eye movements,” Journal of the neurological sciences, vol. 277, no. 1-2, pp. 32-6, Feb 2009.
  • J. Lunn, T. Donovan, D. Litchfield, C. Lewis, R. Davies, and T. Crawford, “Saccadic Eye Movement Abnormalities in Children with Epilepsy,” PloS one, vol. 11, no. 8, pp. e0160508-e0160508, Aug 2016.
  • J. Beatty, “Task-evoked pupillary responses, processing load, and the structure of processing resources,” Psychological bulletin, vol. 91, no. 2, pp. 276-92, Mar 1982.
  • L. R. Young and D. Sheena, “Survey of eye movement recording methods,” Behavior Research Methods & Instrumentation, vol. 7, no. 5, pp. 397-429, Sep 1975.
  • W.-D. Chang, “Electrooculograms for Human-Computer Interaction: A Review,” Sensors (Basel), vol. 19, no. 12, Jun 14 2019, doi: 10.3390/s19122690.
  • M. Ma and H. Zheng, "Virtual Reality and Serious Games in Healthcare," in Advanced computational intelligence paradigms in healthcare 6. Virtual reality in psychotherapy, rehabilitation, and assessment, vol. 337: Springer, 2011, pp. 169-192.
  • S. Munn, L. Stefano, and J. Pelz, "Fixation-identification in dynamic scenes: Comparing an automated algorithm to manual coding," in Proceedings of the 5th symposium on Applied perception in graphics and visualization, 2008, pp. 33-42.
  • D. Salvucci and J. Goldberg, "Identifying fixations and saccades in eye-tracking protocols," in Proceedings of the 2000 symposium on Eye tracking research & applications, 2000, pp. 71-78.
  • M. Nyström and K. Holmqvist, “An adaptive algorithm for fixation, saccade, and glissade detection in eyetracking data,” Behavior Research Methods, vol. 42, no. 1, pp. 188-204, 2010/02/01 2010, doi: 10.3758/BRM.42.1.188.
  • Y. Imaoka, A. Flury, and E. D. de Bruin, “Assessing Saccadic Eye Movements With Head-Mounted Display Virtual Reality Technology,” Frontiers in Psychiatry, vol. 11, no. 922, p. 572938, September 2020.
  • S. Woltering, N. Bazargani, and Z.-X. Liu, “Eye blink correction: a test on the preservation of common ERP components using a regression based technique,” PeerJ, vol. 1, pp. e76-e76, May 2013.
  • N. Galley, D. Betz, and C. Biniossek, "Fixation durations - Why are they so highly variable?," D. E. v. R. C. Z. L. a. h. d. Rational-Choice-Theorie Ed., 2015, pp. 83-106.
  • H.-Y. Kim, “Statistical notes for clinical researchers: assessing normal distribution (2) using skewness and kurtosis,” Restorative dentistry & endodontics, vol. 38, no. 1, pp. 52-54, Feb 2013.
  • L. Ferman, H. Collewijn, T. C. Jansen, and A. V. Van den Berg, “Human gaze stability in the horizontal, vertical and torsional direction during voluntary head movements, evaluated with a three-dimensional scleral induction coil technique,” Vision research, vol. 27, no. 5, pp. 811-28, 1987.
  • A. A. Skavenski, R. M. Hansen, R. M. Steinman, and B. J. Winterson, “Quality of retinal image stabilization during small natural and artificial body rotations in man,” Vision research, vol. 19, no. 6, pp. 675-83, 1979.

Details

Primary Language English
Section Research Articles
Authors

Dilek Betul Arslan 0000-0002-1124-3695

Murat Sükuti 0000-0003-4380-1791

Adil Deniz Duru 0000-0003-3014-9626

Publication Date March 7, 2023
Submission Date September 18, 2022
Published in Issue Year 2023, Volume 14, Issue 52

Cite

APA Arslan, D. B., Sükuti, M., & Duru, A. D. (2023). The Identification of Individualized Eye Tracking Metrics in VR Using Data Driven Iterative-Adaptive Algorithm. AJIT-E: Academic Journal of Information Technology, 14(52), 8-21. https://doi.org/10.5824/ajite.2023.01.001.x