Research Article

Deep Learning Based Emotion Analysis Model from Real-Time Body Movements
(Derin Öğrenme Tabanlı Gerçek Zamanlı Vücut Hareketlerinden Duygu Analizi Modeli)

Year 2024, Volume: 12, Issue: 2, 664-674, 29.06.2024
https://doi.org/10.29109/gujsc.1384031

Abstract

Communication is a tool used to convey information, feelings, and thoughts between living beings. Research shows that approximately 80% of human communication takes place through body language and gestures, so interpreting a person's body language can allow inferences about that person's emotional state. The aim of this study is to develop a new machine learning model that predicts an individual's emotional state by analyzing human body movements in real time with deep learning techniques. A model was developed that detects the human body, fits a skeleton model to the detected body pose, and collects data from that pose; different models were then trained on the collected data set. In addition, a prediction model was developed for real-time analysis, so that emotions can be inferred from people's body language and facial expressions. The accuracy of the different test models ranged from 92.86% at the lowest to 97.80% at the highest, and the accuracy of the real-time analysis was 94.50%. The results of this study can be applied to improved human-machine interaction, emotional intelligence applications, and social interaction.
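
The abstract outlines a three-stage pipeline: fit a skeleton model to the detected body pose, collect the resulting landmark data, and train classifiers that are then used for real-time prediction. The abstract does not name the pose-estimation library or the classifier architecture, so the sketch below is only illustrative under stated assumptions: MediaPipe Pose (discussed in references [15] and [29]) as the skeleton fitter, OpenCV for the webcam loop, a scikit-learn MLP as a stand-in classifier, and a hypothetical four-class emotion label set.

import cv2                      # pip install opencv-python
import mediapipe as mp          # pip install mediapipe
import numpy as np
from sklearn.neural_network import MLPClassifier  # stand-in for the paper's models

EMOTIONS = ["happy", "sad", "angry", "surprised"]  # hypothetical label set, not from the paper

mp_pose = mp.solutions.pose

def frame_to_features(pose, frame_bgr):
    # Fit the skeleton to one frame; return a flat (x, y, z, visibility) vector or None.
    results = pose.process(cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2RGB))
    if results.pose_landmarks is None:
        return None
    return np.array([[lm.x, lm.y, lm.z, lm.visibility]
                     for lm in results.pose_landmarks.landmark]).flatten()  # 33 landmarks -> 132 values

def run_realtime(classifier):
    # Predict an emotion label for every webcam frame in which a body is detected.
    cap = cv2.VideoCapture(0)
    with mp_pose.Pose(static_image_mode=False, min_detection_confidence=0.5) as pose:
        while cap.isOpened():
            ok, frame = cap.read()
            if not ok:
                break
            features = frame_to_features(pose, frame)
            if features is not None:
                label_idx = int(classifier.predict(features.reshape(1, -1))[0])
                cv2.putText(frame, EMOTIONS[label_idx], (10, 30),
                            cv2.FONT_HERSHEY_SIMPLEX, 1.0, (0, 255, 0), 2)
            cv2.imshow("emotion", frame)
            if cv2.waitKey(1) & 0xFF == ord("q"):
                break
    cap.release()
    cv2.destroyAllWindows()

# Offline training stage: X holds flattened landmark vectors collected as above,
# y the matching emotion indices (0..3).
# clf = MLPClassifier(hidden_layer_sizes=(128, 64), max_iter=500).fit(X, y)
# run_realtime(clf)

In the cited study the classifier is a trained deep learning model rather than this MLP; the sketch only marks where the collected landmark data and the real-time prediction model plug into each other.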

References

  • [1] Gunawan, T. S., Ashraf, A., Riza, B. S., Haryanto, E. V., Rosnelly, R., Kartiwi, M., & Janin, Z., Development of video-based emotion recognition using deep learning with Google Colab. TELKOMNIKA (Telecommunication Computing Electronics and Control), 18 (5), 2463-2471, 2020.
  • [2] Ahmed, F., Bari, A. H., & Gavrilova, M. L., Emotion recognition from body movement. IEEE Access, 8, 11761-11781, 2019.
  • [3] Chowdary, M. K., Nguyen, T. N., & Hemanth, D. J., Deep learning-based facial emotion recognition for human–computer interaction applications. Neural Computing and Applications, 35(32), 23311-23328, 2023.
  • [4] Balti, A., Khelifa, M. M. B., Hassine, S. B., Ouazaa, H. A., Abid, S., Lakhoua, M. N., & Sayadi, M., Gait Analysis and Detection of Human Pose Diseases. In 2022 8th International Conference on Control, Decision and Information Technologies (CoDIT), 1 (IEEE), 1381-1386, 2022.
  • [5] Park, S., Yong Chang, J., Jeong, H., Lee, J. H., & Park, J. Y., Accurate and efficient 3d human pose estimation algorithm using single depth images for pose analysis in golf. In Proceedings of the IEEE conference on computer vision and pattern recognition workshops, 49-57, 2017.
  • [6] Wang, J., Tan, S., Zhen, X., Xu, S., Zheng, F., He, Z., & Shao, L., Deep 3D human pose estimation: A review. Computer Vision and Image Understanding, 210, 103225, 2021.
  • [7] Ota, M., Tateuchi, H., Hashiguchi, T., Kato, T., Ogino, Y., Yamagata, M., & Ichihashi, N., Verification of reliability and validity of motion analysis systems during bilateral squat using human pose tracking algorithm. Gait & Posture, 80, 62-67, 2020.
  • [8] Si, L., & Liu, B., Multifeature Fusion Human Pose Tracking Algorithm Based on Motion Image Analysis. Wireless Communications and Mobile Computing, 2022.
  • [9] Verma, A., Meenpal, T., & Acharya, B., Human body pose distance image analysis for action recognition. International Journal of Pattern Recognition and Artificial Intelligence, 36 (07), 2255012, 2022.
  • [10] Hyppönen, J., Hakala, A., Annala, K., Zhang, H., Peltola, J., Mervaala, E., & Kälviäinen, R., Automatic assessment of the myoclonus severity from videos recorded according to standardized Unified Myoclonus Rating Scale protocol and using human pose and body movement analysis. Seizure, 76, 72-78, 2020.
  • [11] Xu, J., Yu, Z., Ni, B., Yang, J., Yang, X., & Zhang, W., Deep kinematics analysis for monocular 3d human pose estimation. In Proceedings of the IEEE/CVF Conference on computer vision and Pattern recognition, 899-908, 2020.
  • [12] Rohan, A., Rabah, M., Hosny, T., & Kim, S. H., Human pose estimation-based real-time gait analysis using convolutional neural network. IEEE Access, 8, 191542-191550, 2020.
  • [13] Paudel, P., Kwon, Y. J., Kim, D. H., & Choi, K. H., Industrial Ergonomics Risk Analysis Based on 3D-Human Pose Estimation. Electronics, 11 (20), 3403, 2022.
  • [14] Stenum, J., Rossi, C., & Roemmich, R. T., Two-dimensional video-based analysis of human gait using pose estimation. PLoS computational biology, 17 (4), e1008935, 2021.
  • [15] Kim, J. W., Choi, J. Y., Ha, E. J., & Choi, J. H., Human pose estimation using mediapipe pose and optimization method based on a humanoid model. Applied Sciences, 13 (4), 2700, 2023.
  • [16] Wang, W., Enescu, V., & Sahli, H., Adaptive real-time emotion recognition from body movements. ACM Transactions on Interactive Intelligent Systems (TiiS), 5 (4), 1-21, 2015.
  • [17] Santhoshkumar, R., & Geetha, M. K., Deep learning approach: emotion recognition from human body movements. Journal of Mechanics of Continua and Mathematical Sciences (JMCMS), 14(3), 182-195, 2019.
  • [18] Samadiani, N., Huang, G., Cai, B., Luo, W., Chi, C. H., Xiang, Y., & He, J., A review on automatic facial expression recognition systems assisted by multimodal sensor data. Sensors, 19 (8), 1863, 2019.
  • [19] Al Ghaithi, A., A robust approach for shear log predictions using deep learning on big data sets from a carbonate reservoir for integrated reservoir characterization projects. In Second International Meeting for Applied Geoscience & Energy, 1840-1844, Society of Exploration Geophysicists and American Association of Petroleum Geologists, 2022.
  • [20] Zhang, X., Hu, Y., Deng, J., Xu, H., & Wen, H., Feature engineering and artificial intelligence-supported approaches used for electric powertrain fault diagnosis: a review. IEEE Access, 10, 29069-29088, 2022.
  • [21] Mo, Y., Wu, Y., Yang, X., Liu, F., & Liao, Y., Review the state-of-the-art technologies of semantic segmentation based on deep learning. Neurocomputing, 493, 626-646, 2022.
  • [22] Šumak, B., Brdnik, S., & Pušnik, M., Sensors and artificial intelligence methods and algorithms for human–computer intelligent interaction: A systematic mapping study. Sensors, 22 (1), 20, 2021.
  • [23] Yang, C. Y., Chen, P. C., & Huang, W. C., Cross-Domain Transfer of EEG to EEG or ECG Learning for CNN Classification Models. Sensors, 23 (5), 2458, 2023.
  • [24] Tamil Priya, D., & Divya Udayan, J., Transfer learning techniques for emotion classification on visual features of images in the deep learning network. International Journal of Speech Technology, 23, 361-372, 2020.
  • [25] Mukhopadhyay, A., Rajshekar Reddy, G. S., Mukherjee, I., Kumar Gopa, G., Pena-Rios, A., & Biswas, P., Generating Synthetic Data for Deep Learning using VR Digital Twin. In Proceedings of the 2021 5th International Conference on Cloud and Big Data Computing, August, 52-56, 2021.
  • [26] Rudovic, O., Utsumi, Y., Lee, J., Hernandez, J., Ferrer, E. C., Schuller, B., & Picard, R. W., Culturenet: A deep learning approach for engagement intensity estimation from face images of children with autism. In 2018 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), October, 339-346, 2018.
  • [27] Yao, R., Hou, L., Yang, L., Gui, J., & Wu, O., Deep human answer understanding for natural reverse QA. Knowledge-Based Systems, 254, 109625, 2022.
  • [28] Wang, T., Zhao, Y., & Yin, M., Analysis and research on the influence of music on students’ mental health under the background of deep learning. Frontiers in Psychology, 13, 998451, 2022.
  • [29] YOLOv7 Pose vs MediaPipe in Human Pose Estimation. https://learnopencv.com/yolov7-pose-vs-mediapipe-in-human-pose-estimation/. Accessed: Feb. 12, 2024.


Details

Primary Language: Turkish
Subjects: Information Systems (Other)
Section: Design and Technology
Authors

Mustafa Yağcı 0000-0003-2911-3909

Muhammed Erkam Aygül 0009-0008-9473-5417

Early View Date: 13 June 2024
Publication Date: 29 June 2024
Submission Date: 1 November 2023
Acceptance Date: 4 April 2024
Published Issue: Year 2024, Volume: 12, Issue: 2

How to Cite

APA Yağcı, M., & Aygül, M. E. (2024). Derin Öğrenme Tabanlı Gerçek Zamanlı Vücut Hareketlerinden Duygu Analizi Modeli. Gazi University Journal of Science Part C: Design and Technology, 12(2), 664-674. https://doi.org/10.29109/gujsc.1384031

e-ISSN: 2147-9526