© 2021. This work is licensed under http://creativecommons.org/licenses/by/3.0/ (the “License”). Notwithstanding the ProQuest Terms and Conditions, you may use this content in accordance with the terms of the License.

Abstract

Given the high prevalence and detrimental effects of unintentional falls among the elderly, fall detection has become a pressing public health concern. A Fall Detection System (FDS) gathers information from sensors to distinguish falls from routine activities so that immediate medical assistance can be provided; the integrity of the collected data is therefore imperative. Missing values in the data, caused by unreliable data delivery, lossy sensors, local interference, synchronization disturbances, and so forth, greatly hamper its credibility and usefulness, making it unfit for reliable fall detection. This paper presents a noise-tolerant FDS that performs in the presence of missing values. The work focuses on Deep Learning (DL), particularly Recurrent Neural Networks (RNNs) with an underlying Bidirectional Long Short-Term Memory (BiLSTM) stack, to implement an FDS based on wearable sensors. The proposed technique is evaluated on two publicly available datasets: SisFall and UP-Fall Detection. Our system achieves an accuracy of 97.21% and 97.41%, a sensitivity of 96.97% and 99.77%, and a specificity of 93.18% and 91.45% on SisFall and UP-Fall Detection respectively, thus outperforming the existing state of the art on these benchmark datasets. These results suggest that the ability of BiLSTM to retain long-term dependencies from both past and future makes it an appropriate model choice for handling missing values in wearable fall detection systems.
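As an illustration of the missing-value problem the abstract describes, the sketch below shows one simple, hypothetical preprocessing step a wearable FDS pipeline might apply: forward-filling gaps in a sensor window so a sequence model such as a BiLSTM always receives a dense input. The function name `fill_missing` and the representation of dropped samples as `None` are assumptions for this example, not the authors' method.

```python
def fill_missing(window):
    """Forward-fill missing sensor samples (None) in a window.

    A dropped sample inherits the last valid reading; any leading gap
    is back-filled with the first valid sample, so the returned window
    is dense and can be fed to a sequence model.
    """
    filled = []
    last_valid = None
    for sample in window:
        if sample is None:
            filled.append(last_valid)  # carry last valid reading forward
        else:
            last_valid = sample
            filled.append(sample)
    # Back-fill any leading gap with the first valid sample.
    first_valid = next((s for s in filled if s is not None), None)
    return [first_valid if s is None else s for s in filled]

# Example: a 1-D acceleration trace with dropped samples.
trace = [None, 0.98, None, None, 1.02, 9.6, None]
print(fill_missing(trace))  # [0.98, 0.98, 0.98, 0.98, 1.02, 9.6, 9.6]
```

In a real pipeline this step would run per sensor channel before windowed sequences are passed to the BiLSTM classifier; more sophisticated schemes (interpolation, learned imputation) are possible, but forward-filling keeps the example minimal.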

Details

Title: NT-FDS—A Noise Tolerant Fall Detection System Using Deep Learning on Wearable Devices
First page: 2006
Publication year: 2021
Publication date: 2021
Publisher: MDPI AG
e-ISSN: 1424-8220
Source type: Scholarly Journal
Language of publication: English
ProQuest document ID: 2501808311