Research Article

The effects of sensor and feature level fusion methods in multimodal emotion analysis

Volume: 13 Issue: 4, 15 October 2024

Abstract

Fusion-based approaches to multimodal emotion recognition (MER) have attracted considerable attention in recent years. In this study, EEG signals and facial images are fused with Sensor Level Fusion (SLF) and Feature Level Fusion (FLF) methods for multimodal emotion recognition. The general procedure of the study is as follows. First, the EEG signals are converted into angle amplitude graph (AAG) images. Second, the most distinctive face images are automatically selected from among all face images obtained from the video recordings. These two modalities are then fused separately with the SLF and FLF methods, and classification is performed on the fused data. The experiments were performed on the publicly available DEAP dataset. The highest accuracy was 82.14% (standard deviation 5.26) for SLF and 87.62% (standard deviation 6.74) for FLF. These results show that the study makes an important contribution to the field of emotion recognition by providing an effective fusion method.
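The two fusion strategies contrasted in the abstract can be sketched as follows. This is a minimal illustration, not the paper's implementation: the image shapes, the toy feature extractor, and all variable names are assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-ins for one trial's inputs (shapes are illustrative, not from the paper):
aag_image = rng.random((64, 64))    # angle amplitude graph (AAG) derived from EEG
face_image = rng.random((64, 64))   # most distinctive face frame from the video

def extract_features(image):
    """Toy feature extractor: per-row means (a real system would use a CNN, etc.)."""
    return image.mean(axis=1)

# Sensor Level Fusion (SLF): merge the raw modalities first, then extract
# features from the single fused input.
slf_input = np.concatenate([aag_image, face_image], axis=0)  # stacked 128x64 input
slf_features = extract_features(slf_input)                   # one 128-dim vector

# Feature Level Fusion (FLF): extract features per modality, then concatenate
# the per-modality feature vectors before classification.
flf_features = np.concatenate([extract_features(aag_image),
                               extract_features(face_image)])  # 64 + 64 = 128 dims

print(slf_features.shape, flf_features.shape)  # (128,) (128,)
```

Either fused representation would then be passed to a classifier; the difference is only whether the modalities are combined before or after feature extraction.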

Keywords

Project Number

121E002

Acknowledgements

This research was supported by the Scientific and Technological Research Council of Türkiye (TÜBİTAK) through projects 121E002 and 119E397.


Details

Primary Language

English

Subjects

Multimodal Analysis and Synthesis, Pattern Recognition, Machine Learning (Other), Information Extraction and Fusion

Section

Research Article

Early View Date

2 September 2024

Publication Date

15 October 2024

Submission Date

16 April 2024

Acceptance Date

25 June 2024

Published Issue

Year 2024 Volume: 13 Issue: 4

Cite

APA
Hatipoğlu Yılmaz, B., & Köse, C. (2024). The effects of sensor and feature level fusion methods in multimodal emotion analysis. Niğde Ömer Halisdemir Üniversitesi Mühendislik Bilimleri Dergisi, 13(4), 1093-1099. https://doi.org/10.28948/ngumuh.1469289