Research Article

ACTIVITY RECOGNITION WITH A HYBRID CNN–NCP AND SELF-ATTENTION-BASED MODEL

Year 2025, Volume: 11 Issue: 2, 290 - 305, 30.12.2025
https://doi.org/10.51477/mejs.1667055

Abstract

In this study, a hybrid model combining a convolutional neural network (CNN), a Neural Circuit Policy (NCP) network, and a Self-Attention mechanism is proposed to recognize human activities from sensor data. The proposed model aims to improve the accuracy of human activity recognition (HAR) systems by learning spatial and temporal dependencies more effectively. The CNN layer, one of the main components of the model, extracts low-level spatial patterns from the sensor data. These features are then passed to the NCP layer, where long-term dependencies in the time-series data are learned through dynamic feedback mechanisms. In the final stage, the Self-Attention mechanism improves the model's decision-making by focusing on the most informative time steps. Accelerometer and gyroscope sensor data are used to classify 12 movement activities commonly performed in daily life. The performance of the model was evaluated with the Group K-Fold cross-validation method, yielding an average accuracy of 90.38%. By outperforming traditional deep learning approaches, the proposed hybrid model offers an innovative solution for HAR systems.
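The Group K-Fold protocol mentioned above keeps all samples from one group (here, one subject) inside a single fold, so no subject's data appears in both the training and test splits. The following is a minimal stdlib-only sketch of that idea, not the authors' code; in practice a library implementation such as scikit-learn's `GroupKFold` would be used, and the subject labels below are illustrative.

```python
from collections import defaultdict

def group_k_fold(groups, k):
    """Yield (train_indices, test_indices) pairs such that all samples
    sharing a group label (e.g. the same subject) stay in one fold."""
    by_group = defaultdict(list)
    for idx, g in enumerate(groups):
        by_group[g].append(idx)
    # Greedily assign the largest groups first to keep fold sizes balanced.
    folds = [[] for _ in range(k)]
    for members in sorted(by_group.values(), key=len, reverse=True):
        min(folds, key=len).extend(members)
    for i in range(k):
        test = sorted(folds[i])
        train = sorted(idx for j, f in enumerate(folds) if j != i for idx in f)
        yield train, test

# Toy example: 8 sensor windows from 4 hypothetical subjects, 2 folds.
groups = ["s1", "s1", "s2", "s2", "s3", "s3", "s4", "s4"]
for train, test in group_k_fold(groups, k=2):
    # No subject ever contributes windows to both splits.
    assert not {groups[i] for i in train} & {groups[i] for i in test}
```

This subject-wise splitting matters in HAR evaluation because consecutive sensor windows from the same person are highly correlated; random K-Fold would leak subject-specific motion patterns into the test set and inflate accuracy.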

Ethical Statement

The authors declare that this study does not require ethics committee approval.

References

  • Ó. D. Lara and M. A. Labrador, “A survey on human activity recognition using wearable sensors,” IEEE Communications Surveys and Tutorials, vol. 15, no. 3, pp. 1192–1209, 2013, doi: 10.1109/SURV.2012.110112.00192.
  • I. A. Lawal and S. Bano, “Deep Human Activity Recognition with Localisation of Wearable Sensors,” IEEE Access, vol. 8, pp. 155060–155070, 2020, doi: 10.1109/ACCESS.2020.3017681.
  • J. Wang, Y. Chen, S. Hao, X. Peng, and L. Hu, “Deep learning for sensor-based activity recognition: A survey,” Pattern Recognit Lett, vol. 119, pp. 3–11, Mar. 2019, doi: 10.1016/J.PATREC.2018.02.010.
  • H. F. Nweke, Y. W. Teh, M. A. Al-garadi, and U. R. Alo, “Deep learning algorithms for human activity recognition using mobile and wearable sensor networks: State of the art and research challenges,” Expert Syst Appl, vol. 105, pp. 233–261, Sep. 2018, doi: 10.1016/J.ESWA.2018.03.056.
  • C. Özkaya and M. Y. Esas, “Jiroskop ve İvme Ölçer Verileriyle Makine Öğrenmesi Algoritmaları Kullanılarak İnsan Aktivitesi Tanımlama [Human Activity Identification Using Machine Learning Algorithms with Gyroscope and Accelerometer Data],” Avrupa Bilim ve Teknoloji Dergisi, no. 28, pp. 811–818, Nov. 2021, doi: 10.31590/EJOSAT.1011461.
  • F. Attal, S. Mohammed, M. Dedabrishvili, F. Chamroukhi, L. Oukhellou, and Y. Amirat, “Physical Human Activity Recognition Using Wearable Sensors,” Sensors, vol. 15, no. 12, pp. 31314–31338, Dec. 2015, doi: 10.3390/S151229858.
  • M. Çalışan and M. F. Talu, “Comparison of Methods for Determining Activity from Physical Movements,” Politeknik Dergisi, vol. 24, no. 1, pp. 17–23, Mar. 2021, doi: 10.2339/POLITEKNIK.632070.
  • Y. L. Hsu, S. C. Yang, H. C. Chang, and H. C. Lai, “Human Daily and Sport Activity Recognition Using a Wearable Inertial Sensor Network,” IEEE Access, vol. 6, pp. 31715–31728, May 2018, doi: 10.1109/ACCESS.2018.2839766.
  • F. J. Ordóñez and D. Roggen, “Deep Convolutional and LSTM Recurrent Neural Networks for Multimodal Wearable Activity Recognition,” Sensors, vol. 16, no. 1, p. 115, Jan. 2016, doi: 10.3390/S16010115.
  • F. Zhou, R. Wang, H. Su, and S. Xu, “A Human Activity Recognition Model Based on Wearable Sensor,” Proceedings - 2022 9th International Conference on Digital Home, ICDH 2022, pp. 169–174, 2022, doi: 10.1109/ICDH57206.2022.00033.
  • C. A. Ronao and S. B. Cho, “Human activity recognition with smartphone sensors using deep learning neural networks,” Expert Syst Appl, vol. 59, pp. 235–244, Oct. 2016, doi: 10.1016/J.ESWA.2016.04.032.
  • J. Iskanderov and M. A. Güvensan, “Activity recognition using smartphones and wearable devices: Traditional approaches, New solutions,” Pamukkale University Journal of Engineering Sciences, vol. 25, no. 2, pp. 223–239, 2019, doi: 10.5505/PAJES.2018.84758.
  • “MHEALTH - UCI Machine Learning Repository.” Accessed: Mar. 23, 2025. [Online]. Available: https://archive.ics.uci.edu/dataset/319/mhealth+dataset
  • “Human Activity Recognition Using Smartphones - UCI Machine Learning Repository.” Accessed: Mar. 23, 2025. [Online]. Available: https://archive.ics.uci.edu/dataset/240/human+activity+recognition+using+smartphones
  • S. M. Lee, S. M. Yoon, and H. Cho, “Human activity recognition from accelerometer data using Convolutional Neural Network,” 2017 IEEE International Conference on Big Data and Smart Computing, BigComp 2017, pp. 131–134, Mar. 2017, doi: 10.1109/BIGCOMP.2017.7881728.
  • G.-B. Huang, Q.-Y. Zhu, and C.-K. Siew, “Extreme learning machine: Theory and applications,” Neurocomputing, vol. 70, no. 1–3, pp. 489–501, Dec. 2006, doi: 10.1016/J.NEUCOM.2005.12.126.
  • W. Zhu et al., “Co-Occurrence Feature Learning for Skeleton Based Action Recognition Using Regularized Deep LSTM Networks,” Proceedings of the AAAI Conference on Artificial Intelligence, vol. 30, no. 1, pp. 3697–3703, Mar. 2016, doi: 10.1609/AAAI.V30I1.10451.
  • J. R. Kwapisz, G. M. Weiss, and S. A. Moore, “Activity recognition using cell phone accelerometers,” ACM SIGKDD Explorations Newsletter, vol. 12, no. 2, pp. 74–82, Mar. 2011, doi: 10.1145/1964897.1964918.
  • A. Anjum and M. U. Ilyas, “Activity recognition using smartphone sensors,” 2013 IEEE 10th Consumer Communications and Networking Conference, CCNC 2013, pp. 914–919, 2013, doi: 10.1109/CCNC.2013.6488584.
  • D. N. Tran and D. D. Phan, “Human Activities Recognition in Android Smartphone Using Support Vector Machine,” Proceedings - International Conference on Intelligent Systems, Modelling and Simulation, ISMS, vol. 0, pp. 64–68, Jul. 2016, doi: 10.1109/ISMS.2016.51.
  • R. A. Voicu, C. Dobre, L. Bajenaru, and R. I. Ciobanu, “Human Physical Activity Recognition Using Smartphone Sensors,” Sensors, vol. 19, no. 3, p. 458, Jan. 2019, doi: 10.3390/S19030458.
  • Y. Tian and W. Chen, “MEMS-based human activity recognition using smartphone,” Chinese Control Conference, CCC, vol. 2016-August, pp. 3984–3989, Aug. 2016, doi: 10.1109/CHICC.2016.7553975.
  • S. Wan, L. Qi, X. Xu, C. Tong, and Z. Gu, “Deep Learning Models for Real-time Human Activity Recognition with Smartphones,” Mobile Networks and Applications, vol. 25, no. 2, pp. 743–755, Apr. 2020, doi: 10.1007/S11036-019-01445-X.
  • Y. Chen and C. Shen, “Performance Analysis of Smartphone-Sensor Behavior for Human Activity Recognition,” IEEE Access, vol. 5, pp. 3095–3110, 2017, doi: 10.1109/ACCESS.2017.2676168.
  • N. Ahmed, J. I. Rafiq, and M. R. Islam, “Enhanced Human Activity Recognition Based on Smartphone Sensor Data Using Hybrid Feature Selection Model,” Sensors, vol. 20, no. 1, p. 317, Jan. 2020, doi: 10.3390/S20010317.
  • E. Bulbul, A. Cetin, and I. A. Dogru, “Human Activity Recognition Using Smartphones,” ISMSIT 2018 - 2nd International Symposium on Multidisciplinary Studies and Innovative Technologies, Proceedings, Dec. 2018, doi: 10.1109/ISMSIT.2018.8567275.
  • M. Lechner, R. Hasani, A. Amini, T. A. Henzinger, D. Rus, and R. Grosu, “Neural circuit policies enabling auditable autonomy,” Nature Machine Intelligence, vol. 2, no. 10, pp. 642–652, Oct. 2020, doi: 10.1038/s42256-020-00237-3.
  • G. Yan et al., “Network control principles predict neuron function in the Caenorhabditis elegans connectome,” Nature, vol. 550, no. 7677, pp. 519–523, Oct. 2017, doi: 10.1038/nature24056.
  • R. Hasani, M. Lechner, A. Amini, D. Rus, and R. Grosu, “Liquid Time-constant Networks,” Proceedings of the AAAI Conference on Artificial Intelligence, vol. 35, no. 9, pp. 7657–7666, May 2021, doi: 10.1609/AAAI.V35I9.16936.
  • “Mobile Health Human Behavior Analysis.” Accessed: Mar. 23, 2025. [Online]. Available: https://www.kaggle.com/datasets/gaurav2022/mobile-health/data
  • W. Jiang and Z. Yin, “Human activity recognition using wearable sensors by deep convolutional neural networks,” MM 2015 - Proceedings of the 2015 ACM Multimedia Conference, pp. 1307–1310, Oct. 2015, doi: 10.1145/2733373.2806333.
  • M. Sokolova and G. Lapalme, “A systematic analysis of performance measures for classification tasks,” Inf Process Manag, vol. 45, no. 4, pp. 427–437, Jul. 2009, doi: 10.1016/J.IPM.2009.03.002.
  • B. Y. Kazangirler and E. Özkaynak, “Conventional Machine Learning and Ensemble Learning Techniques in Cardiovascular Disease Prediction and Analysis,” Journal of Intelligent Systems: Theory and Applications, vol. 7, no. 2, pp. 81–94, Sep. 2024, doi: 10.38016/jista.1439504.
  • S. P. Singh, M. K. Sharma, A. Lay-Ekuakille, D. Gangwar, and S. Gupta, “Deep ConvLSTM With Self-Attention for Human Activity Decoding Using Wearable Sensors,” IEEE Sensors Journal, vol. 21, no. 6, pp. 8575–8582, Mar. 2021, doi: 10.1109/JSEN.2020.3045135.
  • M. A. Khatun, M. Alquhayz, M. S. Uddin, M. Rubaiee, and R. Alroobaea, “Deep CNN-LSTM With Self-Attention Model for Human Activity Recognition Using Wearable Sensor,” IEEE Journal of Translational Engineering in Health and Medicine, vol. 10, pp. 1–11, 2022, Art. no. 2700316, doi: 10.1109/JTEHM.2022.3177710.
There are 35 citations in total.

Details

Primary Language English
Subjects Electrical Engineering (Other)
Journal Section Research Article
Authors

Nevzat Olgun 0000-0003-2461-4923

Mücahit Çalışan 0000-0003-2651-5937

Caner Balım 0000-0002-1010-129X

Submission Date March 27, 2025
Acceptance Date September 19, 2025
Publication Date December 30, 2025
Published in Issue Year 2025 Volume: 11 Issue: 2

Cite

IEEE N. Olgun, M. Çalışan, and C. Balım, “ACTIVITY RECOGNITION WITH A HYBRID CNN–NCP AND SELF-ATTENTION-BASED MODEL”, MEJS, vol. 11, no. 2, pp. 290–305, 2025, doi: 10.51477/mejs.1667055.

Creative Commons License

This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License
