Elderly Fall Detection and Fall Direction Detection via Various Machine Learning Algorithms Using Wearable Sensors

Year 2021, Volume 8, Issue 3, 197–205, 29.09.2021
https://doi.org/10.17350/HJSE19030000230

Abstract

The world population is aging rapidly. Some elderly people live alone, and even those who live with their families often have to stay home alone, especially during the working hours of the adult family members. Falling while alone at home frequently causes serious injury, and even death, in elderly individuals. Fall detection systems detect falls so that emergency healthcare services can be alerted quickly. In this study, a two-stage fall detection and fall direction detection system was developed using a public dataset and a comparative evaluation of five machine learning algorithms. If a fall is detected in the first stage, the second stage is started and the direction of the fall is determined. Knowing the fall direction enables future research and protective measures, such as deploying an airbag on the side of the fall. Thus, a stepwise fall detection and fall direction detection system was developed by identifying the best-performing classification algorithms. The results show that the Ensemble Subspace k-NN classifier performed slightly better than the other classifiers. On the test data, corresponding to 30% of the total data and never used during training, falls were detected with 99.4% accuracy, and the fall direction was then determined with 97.2% accuracy.
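The abstract describes a cascade: a first classifier separates falls from daily activities, and only when a fall is flagged does a second classifier assign the fall direction. The sketch below illustrates that flow. It is a minimal approximation, assuming scikit-learn's BaggingClassifier over k-NN base learners as a stand-in for the "Ensemble Subspace k-NN" model reported in the paper; the synthetic feature matrix, the four direction labels, and all hyperparameters are placeholders rather than the authors' data or settings.

```python
import numpy as np
from sklearn.ensemble import BaggingClassifier
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

# Placeholder data standing in for windowed wearable-sensor features:
# X are feature vectors, y_fall marks fall (1) vs. non-fall (0) windows,
# y_dir holds a hypothetical fall-direction label (0..3) for every window.
rng = np.random.default_rng(0)
X = rng.normal(size=(600, 24))
y_fall = rng.integers(0, 2, size=600)
y_dir = rng.integers(0, 4, size=600)

def subspace_knn(n_estimators=30, max_features=0.5, k=1):
    """Random-subspace ensemble of k-NN learners, an approximation of the
    'Ensemble Subspace k-NN' model named in the abstract."""
    return BaggingClassifier(
        KNeighborsClassifier(n_neighbors=k),
        n_estimators=n_estimators,
        max_features=max_features,  # each learner sees a random feature subset
        bootstrap=False,            # keep all samples, vary only the features
        random_state=0,
    )

# 70/30 split: 30% of the data is held out and never used for training.
X_tr, X_te, yf_tr, yf_te, yd_tr, yd_te = train_test_split(
    X, y_fall, y_dir, test_size=0.30, stratify=y_fall, random_state=0)

# Stage 1: fall vs. activity-of-daily-living classification.
stage1 = subspace_knn().fit(X_tr, yf_tr)
fall_pred = stage1.predict(X_te)
print("fall detection accuracy:", accuracy_score(yf_te, fall_pred))

# Stage 2: fall-direction classification, trained on fall windows only and
# applied only to windows that stage 1 flagged as falls.
stage2 = subspace_knn().fit(X_tr[yf_tr == 1], yd_tr[yf_tr == 1])
detected = fall_pred == 1
if detected.any():
    dir_pred = stage2.predict(X_te[detected])
    truly_fell = yf_te[detected] == 1  # score direction only on true falls
    if truly_fell.any():
        print("fall direction accuracy:",
              accuracy_score(yd_te[detected][truly_fell], dir_pred[truly_fell]))
```

Training the direction model only on fall windows mirrors the two-stage design: direction labels are meaningless for non-fall activity, so the second classifier never sees them.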

References

  • Adhikari, K. (2019) ‘Computer Vision Based Posture Estimation and Fall Detection’, p. 131.
  • de Assis Neto, S. R. et al. (2019) ‘Detecting Human Activities Based on a Multimodal Sensor Data Set Using a Bidirectional Long Short-Term Memory Model: A Case Study’, in Ponce, H. et al. (eds) Challenges and Trends in Multimodal Fall Detection for Healthcare. Cham, Switzerland: Springer, pp. 31–51. doi: 10.1007/978-3-030-38748-8.
  • Casilari, E., Lora-Rivera, R. and García-Lagos, F. (2020) ‘A study on the application of convolutional neural networks to fall detection evaluated with multiple public datasets’, Sensors (Switzerland), 20(5). doi: 10.3390/s20051466.
  • Chelli, A. and Pätzold, M. (2019) ‘A Machine Learning Approach for Fall Detection and Daily Living Activity Recognition’, IEEE Access, 7, pp. 38670–38687. doi: 10.1109/ACCESS.2019.2906693.
  • Chen, J. et al. (2005) ‘Wearable sensors for reliable fall detection’, Annual International Conference of the IEEE Engineering in Medicine and Biology - Proceedings. IEEE, 7 VOLS, pp. 3551–3554. doi: 10.1109/iembs.2005.1617246.
  • Cho, H. and Yoon, S. M. (2019) ‘Applying singular value decomposition on accelerometer data for 1D convolutional neural network based fall detection’, Electronics Letters, 55(6), pp. 320–322. doi: 10.1049/el.2018.6117.
  • Daga, B. S., Ghatol, A. A. and Thakare, V. M. (2018) ‘Silhouette based human fall detection using multimodal classifiers for content based video retrieval systems’, 2017 International Conference on Intelligent Computing, Instrumentation and Control Technologies, ICICICT 2017, 2018-January, pp. 1409–1416. doi: 10.1109/ICICICT1.2017.8342776.
  • Delgado-Escaño, R. et al. (2020) ‘A cross-dataset deep learning-based classifier for people fall detection and identification’, Computer Methods and Programs in Biomedicine, 184. doi: 10.1016/j.cmpb.2019.105265.
  • Gupta, R. et al. (2020) ‘Compressive sensing based privacy for fall detection’, pp. 1–10. Available at: http://arxiv.org/abs/2001.03463.
  • Han, J. et al. (2013) ‘Enhanced computer vision with Microsoft Kinect sensor: A review’, IEEE Transactions on Cybernetics. IEEE, 43(5), pp. 1318–1334. doi: 10.1109/TCYB.2013.2265378.
  • Han, Q. et al. (2020) ‘A Two-Stream Approach to Fall Detection With MobileVGG’, IEEE Access, 8, pp. 17556–17566. doi: 10.1109/access.2019.2962778.
  • Jung, H. et al. (2020) ‘Enhanced algorithm for the detection of preimpact fall for wearable airbags’, Sensors (Switzerland), 20(5). doi: 10.3390/s20051277.
  • Kang, J. M., Yoo, T. and Kim, H. C. (2006) ‘A wrist-worn integrated health monitoring instrument with a tele-reporting device for telemedicine and telecare’, IEEE Transactions on Instrumentation and Measurement, 55(5), pp. 1655–1661. doi: 10.1109/TIM.2006.881035.
  • Kerdjidj, O. et al. (2020) ‘Fall detection and human activity classification using wearable sensors and compressed sensing’, Journal of Ambient Intelligence and Humanized Computing. Springer Berlin Heidelberg, 11(1), pp. 349–361. doi: 10.1007/s12652-019-01214-4.
  • Li, Q. et al. (2009) ‘Accurate, fast fall detection using gyroscopes and accelerometer-derived posture information’, Proceedings - 2009 6th International Workshop on Wearable and Implantable Body Sensor Networks, BSN 2009. IEEE, pp. 138–143. doi: 10.1109/BSN.2009.46.
  • Martínez-Villaseñor, L., Ponce, H. and Pérez-Daniel, K. (2019) ‘Deep learning for multimodal fall detection’, Conference Proceedings - IEEE International Conference on Systems, Man and Cybernetics, 2019-October, pp. 3422–3429. doi: 10.1109/SMC.2019.8914429.
  • Mastorakis, G. and Makris, D. (2014) ‘Fall detection system using Kinect’s infrared sensor’, Journal of Real-Time Image Processing, 9(4), pp. 635–646. doi: 10.1007/s11554-012-0246-9.
  • Nait-Charif, H. and McKenna, S. J. (2004) ‘Activity summarisation and fall detection in a supportive home environment’, Proceedings - International Conference on Pattern Recognition. IEEE, 4, pp. 323–326. doi: 10.1109/ICPR.2004.1333768.
  • Nho, Y.-H., Lim, J. G. and Kwon, D.-S. (2020) ‘Cluster-Analysis-based User-Adaptive Fall Detection using Fusion of Heart Rate Sensor and Accelerometer in a Wearable Device’, IEEE Access, 4, pp. 1–1. doi: 10.1109/access.2020.2969453.
  • Nyan, M. N. et al. (2006) ‘Distinguishing fall activities from normal activities by angular rate characteristics and high-speed camera characterization’, Medical Engineering and Physics, 28(8), pp. 842–849. doi: 10.1016/j.medengphy.2005.11.008.
  • Putra, I. P. E. S. et al. (2018) ‘An event-triggered machine learning approach for accelerometer-based fall detection’, Sensors (Switzerland), 18(1), pp. 1–18. doi: 10.3390/s18010020.
  • Putra, I. P. E. S. and Vesilo, R. (2018) ‘Window-size impact on detection rate of wearable-sensor-based fall detection using supervised machine learning’, 2017 IEEE Life Sciences Conference, LSC 2017, 2018-January, pp. 21–26. doi: 10.1109/LSC.2017.8268134.
  • Saleh, M. and Le Bouquin Jeannès, R. (2018) ‘An efficient machine learning-based fall detection algorithm using local binary features’, European Signal Processing Conference. EURASIP, 2018-September, pp. 667–671. doi: 10.23919/EUSIPCO.2018.8553340.
  • Saleh, M. and Le Bouquin Jeannès, R. (2019) ‘Elderly Fall Detection Using Wearable Sensors: A Low Cost Highly Accurate Algorithm’, IEEE Sensors Journal. IEEE, 19(8), pp. 3156–3164. doi: 10.1109/JSEN.2019.2891128.
  • Srinivasan, S. et al. (2007) ‘Towards automatic detection of falls using wireless sensors’, Annual International Conference of the IEEE Engineering in Medicine and Biology - Proceedings, pp. 1379–1382. doi: 10.1109/IEMBS.2007.4352555.
  • Sucerquia, A., López, J. D. and Vargas-Bonilla, J. F. (2017) ‘SisFall: A fall and movement dataset’, Sensors (Switzerland), 17(1). doi: 10.3390/s17010198.
  • Sucerquia, A., López, J. D. and Vargas-Bonilla, J. F. (2018) ‘Real-life/real-time elderly fall detection with a triaxial accelerometer’, Sensors (Switzerland), 18(4), pp. 1–18. doi: 10.3390/s18041101.
  • United Nations (2019) World Population Prospects 2019. Department of Economic and Social Affairs.
  • WHO (2007) Falls. Available at: https://www.who.int/news-room/fact-sheets/detail/falls (Accessed: 19 February 2020).
  • Wu, G. (2000) ‘Distinguishing fall activities from normal activities by velocity characteristics’, Journal of Biomechanics, 33(11), pp. 1497–1500. doi: 10.1016/S0021-9290(00)00117-2.

Details

Primary Language English
Subjects Engineering
Journal Section Research Articles
Authors

Sıtkı Kocaoğlu 0000-0003-1048-9623

Yılmaz Güven 0000-0002-8205-2490

Publication Date September 29, 2021
Submission Date February 2, 2021
Published in Issue Year 2021, Volume 8, Issue 3

Cite

Vancouver Kocaoğlu S, Güven Y. Elderly Fall Detection and Fall Direction Detection via Various Machine Learning Algorithms Using Wearable Sensors. Hittite J Sci Eng. 2021;8(3):197-205.

Hittite Journal of Science and Engineering is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License (CC BY-NC).