Research Article

From Textiles to Insights: Evaluating Deep Learning Models on Wearable Sensor Data for Activity Recognition

Year 2025, Volume: 35 Issue: 3, 169 - 185

Abstract

Textile-integrated wearable sensors have emerged as transformative tools for unobtrusive, continuous monitoring of human movement and physiology. These fabrics, embedded with conductive and functional materials, capture rich, multidimensional data streams during naturally occurring activities. In this study, we comprehensively compare the performance of five deep learning methodologies, namely long short-term memory (LSTM), recurrent neural network (RNN), convolutional neural network (CNN), deep belief network (DBN), and autoencoder (AE) models, on a 19-class human activity recognition (HAR) dataset derived from textile-based wearable sensors. Our findings demonstrate that the CNN and LSTM models outperform their counterparts, achieving exceptional accuracy in distinguishing subtle, overlapping movement patterns. This superior performance underscores the potential of advanced neural architectures to exploit the complex temporal-spatial information inherent in textile sensor data. While the RNN and AE models also prove effective, they require refinement for challenging activities, and the DBN shows comparatively lower performance. The insights gained from this evaluation guide the development of more robust, user-centric HAR systems, opening new avenues for healthcare, sports analytics, and ergonomic applications.
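
To make the comparison concrete, the sketch below illustrates two of the compared model families, a 1D CNN and an LSTM, applied to windowed multi-channel sensor data for a 19-class classification task. It is a minimal illustration under stated assumptions, not the authors' implementation: the window length (125 samples), channel count (45), layer sizes, and training settings are illustrative choices, not values reported in the paper, and the data here are random stand-ins for segmented sensor windows.

# Minimal sketch (not the authors' implementation): a 1D CNN and an LSTM
# classifier for 19-class HAR on windowed wearable-sensor data.
# Window length, channel count, and hyperparameters are illustrative assumptions.
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

N_CLASSES, TIMESTEPS, CHANNELS = 19, 125, 45  # assumed segment shape, not from the paper

def build_cnn():
    # Two 1D convolution blocks followed by global pooling and a softmax head.
    return keras.Sequential([
        keras.Input(shape=(TIMESTEPS, CHANNELS)),
        layers.Conv1D(64, kernel_size=5, activation="relu"),
        layers.MaxPooling1D(2),
        layers.Conv1D(128, kernel_size=5, activation="relu"),
        layers.GlobalAveragePooling1D(),
        layers.Dense(N_CLASSES, activation="softmax"),
    ])

def build_lstm():
    # Two stacked LSTM layers; the last hidden state feeds the softmax head.
    return keras.Sequential([
        keras.Input(shape=(TIMESTEPS, CHANNELS)),
        layers.LSTM(128, return_sequences=True),
        layers.LSTM(64),
        layers.Dense(N_CLASSES, activation="softmax"),
    ])

# Toy data standing in for segmented sensor windows and integer activity labels.
X = np.random.randn(256, TIMESTEPS, CHANNELS).astype("float32")
y = np.random.randint(0, N_CLASSES, size=256)

for name, build in [("CNN", build_cnn), ("LSTM", build_lstm)]:
    model = build()
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    model.fit(X, y, epochs=1, batch_size=32, verbose=0)
    print(name, "accuracy on toy data:", model.evaluate(X, y, verbose=0)[1])

In a real evaluation such as the one described in the abstract, the toy arrays would be replaced by subject-wise train/test splits of the textile sensor segments, and each architecture would be tuned and scored on held-out data.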

Ethical Statement

The manuscript does not require any ethics committee approval.

Supporting Institution

TÜBİTAK - Scientific and Technological Research Council of Turkey

Project Number

121E709

Thanks

This study is supported by the Scientific and Technological Research Council of Turkey (TÜBİTAK) under the project entitled "Development of Wearable Technology Product T-Shirt (Track T-shirt) and Artificial Intelligence-Based Decision Support System for Tracking and Improving Athlete Performance" (project no. 121E709).

References

  • 1. De Fazio, R., Mastronardi, V. M., De Vittorio, M., Visconti, P. 2023. Wearable sensors and smart devices to monitor rehabilitation parameters and sports performance: an overview. Sensors, 23(4), 1856.
  • 2. Ding, Y., Jiang, J., Wu, Y., Zhang, Y., Zhou, J., Zhang, Y., ..., Zheng, Z. 2024. Porous conductive textiles for wearable electronics. Chemical Reviews, 124(4), 1535-1648.
  • 3. Yogev, D., Goldberg, T., Arami, A., Tejman-Yarden, S., Winkler, T. E., Maoz, B. M. 2023. Current state of the art and future directions for implantable sensors in medical technology: Clinical needs and engineering challenges. APL Bioengineering, 7(3).
  • 4. Heikenfeld, J., Jajack, A., Rogers, J., Gutruf, P., Tian, L., Pan, T., ..., Wang, J. 2018. Wearable sensors: modalities, challenges, and prospects. Lab on a Chip, 18(2), 217-248.
  • 5. Levin, A., Gong, S., Cheng, W. 2023. Wearable smart bandage-based bio-sensors. Biosensors, 13(4), 462.
  • 6. Guinovart, T., Valdés-Ramírez, G., Windmiller, J. R., Andrade, F. J., Wang, J. 2014. Bandage-based wearable potentiometric sensor for monitoring wound pH. Electroanalysis, 26(6), 1345-1353.
  • 7. Chen, S., Qi, J., Fan, S., Qiao, Z., Yeo, J. C., Lim, C. T. 2021. Flexible wearable sensors for cardiovascular health monitoring. Advanced Healthcare Materials, 10(17), 2100116.
  • 8. Majumder, S., Mondal, T., Deen, M. J. 2017. Wearable sensors for remote health monitoring. Sensors, 17(1), 130.
  • 9. Banaee, H., Ahmed, M. U., Loutfi, A. 2013. Data mining for wearable sensors in health monitoring systems: a review of recent trends and challenges. Sensors, 13(12), 17472-17500.
  • 10. Lee, G., Wei, Q., Zhu, Y. 2021. Emerging wearable sensors for plant health monitoring. Advanced Functional Materials, 31(52), 2106475.
  • 11. Hassan, M. M., Uddin, M. Z., Mohamed, A., Almogren, A. 2018. A robust human activity recognition system using smartphone sensors and deep learning. Future Generation Computer Systems, 81, 307-313.
  • 12. Zhang, S., Li, Y., Zhang, S., Shahabi, F., Xia, S., Deng, Y., Alshurafa, N. 2022. Deep learning in human activity recognition with wearable sensors: A review on advances. Sensors, 22(4), 1476.
  • 13. Dentamaro, V., Gattulli, V., Impedovo, D., Manca, F. 2024. Human activity recognition with smartphone-integrated sensors: A survey. Expert Systems with Applications, 123143.
  • 14. Saha, A., Rajak, S., Saha, J., Chowdhury, C. 2024. A survey of machine learning and meta-heuristics approaches for sensor-based human activity recognition systems. Journal of Ambient Intelligence and Humanized Computing, 15(1), 29-56.
  • 15. Ghimire, S., Deo, R. C., Raj, N., Mi, J. 2019. Deep solar radiation forecasting with convolutional neural network and long short-term memory network algorithms. Applied Energy, 253, 113541.
  • 16. Lai, G., Chang, W. C., Yang, Y., Liu, H. 2018. Modeling long- and short-term temporal patterns with deep neural networks. In The 41st International ACM SIGIR Conference on Research & Development in Information Retrieval (pp. 95-104).
  • 17. Hassan, A., Mahmood, A. 2018. Convolutional recurrent deep learning model for sentence classification. IEEE Access, 6, 13949-13957.
  • 18. Murad, A., Pyun, J. Y. 2017. Deep recurrent neural networks for human activity recognition. Sensors, 17(11), 2556.
  • 19. Si, C., Chen, W., Wang, W., Wang, L., Tan, T. 2019. An attention enhanced graph convolutional LSTM network for skeleton-based action recognition. In Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (pp. 1227-1236).
  • 20. Ye, W., Cheng, J., Yang, F., Xu, Y. 2019. Two-stream convolutional network for improving activity recognition using convolutional long short-term memory networks. IEEE Access, 7, 67772-67780.
  • 21. Ordóñez, F. J., Roggen, D. 2016. Deep convolutional and LSTM recurrent neural networks for multimodal wearable activity recognition. Sensors, 16(1), 115.
  • 22. Huang, W., Zhang, L., Wu, H., Min, F., Song, A. 2022. Channel-Equalization-HAR: a light-weight convolutional neural network for wearable sensor based human activity recognition. IEEE Transactions on Mobile Computing, 22(9), 5064-5077.
  • 23. Shi, W., Fang, X., Yang, G., Huang, J. 2022. Human activity recognition based on multichannel convolutional neural network with data augmentation. IEEE Access, 10, 76596-76606.
  • 24. Wang, Y., Xu, H., Liu, Y., Wang, M., Wang, Y., Yang, Y., ..., Li, J. 2023. A novel deep multifeature extraction framework based on attention mechanism using wearable sensor data for human activity recognition. IEEE Sensors Journal, 23(7), 7188-7198.
  • 25. Zhong, P., Gong, Z., Li, S., Schönlieb, C. B. 2017. Learning to diversify deep belief networks for hyperspectral image classification. IEEE Transactions on Geoscience and Remote Sensing, 55(6), 3516-3530.
  • 26. Thakur, D., Biswas, S., Ho, E. S., Chattopadhyay, S. 2022. ConvAE-LSTM: Convolutional autoencoder long short-term memory network for smartphone-based human activity recognition. IEEE Access, 10, 4137-4156.
  • 27. Mohd Noor, M. H. 2021. Feature learning using convolutional denoising autoencoder for activity recognition. Neural Computing and Applications, 33(17), 10909-10922.
  • 28. Nikolova, D., Vladimirov, I., Manolova, A. 2023. An experimental analysis of deep learning models for human activity recognition with synthetic data. In 2023 58th International Scientific Conference on Information, Communication and Energy Systems and Technologies (ICEST) (pp. 277-280). IEEE.
  • 29. Bento, N., Rebelo, J., Barandas, M., Carreiro, A. V., Campagner, A., Cabitza, F., Gamboa, H. 2022. Comparing handcrafted features and deep neural representations for domain generalization in human activity recognition. Sensors, 22(19), 7324.
  • 30. Zhongkai, Z., Kobayashi, S., Kondo, K., Hasegawa, T., Koshino, M. 2022. A comparative study: Toward an effective convolutional neural network architecture for sensor-based human activity recognition. IEEE Access, 10, 20547-20558.
  • 31. Altun, K., Barshan, B., Tunçel, O. 2010. Comparative study on classifying human activities with miniature inertial and magnetic sensors. Pattern Recognition, 43(10), 3605-3620.
  • 32. Tufek, N., Yalcin, M., Altintas, M., Kalaoglu, F., Li, Y., Bahadir, S. K. 2019. Human action recognition using deep learning methods on limited sensory data. IEEE Sensors Journal, 20(6), 3101-3112.
  • 33. Jain, A., Kanhangad, V. 2017. Human activity classification in smartphones using accelerometer and gyroscope sensors. IEEE Sensors Journal, 18(3), 1169-1177.
  • 34. Lu, W., Fan, F., Chu, J., Jing, P., Yuting, S. 2018. Wearable computing for Internet of Things: A discriminant approach for human activity recognition. IEEE Internet of Things Journal, 6(2), 2749-2759.

Details

Primary Language English
Subjects Wearable Materials
Journal Section Articles
Authors

Mehmet Ali Balcı 0000-0003-1465-7153

Ömer Akgüller 0000-0002-7061-2534

Ozan Kayacan 0000-0001-9246-0730

Özgür Tamer 0000-0002-5776-6627

Aksel Çelik 0000-0003-0961-7616

Tolga Akşit 0000-0003-0223-4508

Duygu Yavuzkasap Ayakta 0000-0003-0108-9839

Aysu Ildeniz 0000-0001-8299-9433

Project Number 121E709
Early Pub Date October 1, 2025
Publication Date October 25, 2025
Submission Date December 24, 2024
Acceptance Date March 26, 2025
Published in Issue Year 2025 Volume: 35 Issue: 3

Cite

APA Balcı, M. A., Akgüller, Ö., Kayacan, O., … Tamer, Ö. (2025). From Textiles to Insights: Evaluating Deep Learning Models on Wearable Sensor Data for Activity Recognition. Textile and Apparel, 35(3), 169-185. https://doi.org/10.32710/tekstilvekonfeksiyon.1606757

No part of this journal may be reproduced, stored, transmitted or disseminated in any form or by any means without the prior written permission of the Editorial Board. The views and opinions expressed in the articles are those of the authors and not those of Tekstil ve Konfeksiyon and the Textile and Apparel Research-Application Center.