Abstract

The use of wearable sensors allows continuous recording of physical activity from participants in free-living or at-home clinical studies. The large amount of data collected demands automatic analysis pipelines to extract gait parameters that can be used as clinical endpoints. We introduce a deep learning-based automatic pipeline for wearables that processes tri-axial accelerometry data, segments gait bouts, and detects the gait events of initial contact (IC) and final contact (FC) from a single sensor located at the lower back (near L5), the shin, or the wrist. The detected gait events are subsequently used to estimate gait parameters such as step time, step length, and symmetry. We report results from a leave-one-subject-out (LOSO) validation on a pilot study dataset of five participants clinically diagnosed with Parkinson’s disease (PD) and six healthy controls (HC). Participants wore sensors at three body locations and walked on a pressure-sensing walkway to obtain reference gait data. Mean absolute errors (MAE) for IC events ranged from 22.82 to 33.09 milliseconds (ms) for the lower back sensor, while the MAE ranges for the shin and wrist sensors were 28.56–64.66 and 40.19–72.50 ms, respectively. For FC-event detection, the MAE ranges were 29.06–48.42, 40.19–72.70, and 36.06–60.18 ms for the lumbar, wrist, and shin sensors, respectively. Intraclass correlation coefficients, ICC(2,k), between the estimated parameters and the reference data showed good-to-excellent agreement (ICC ≥ 0.84) for the lumbar and shin sensors, excluding double support time (ICC = 0.37 lumbar and 0.38 shin) and swing time (ICC = 0.55 lumbar and 0.59 shin). The wrist sensor also showed good agreement, but its ICCs were lower overall than those of the other two sensors. Our proposed analysis pipeline has the potential to extract up to 100 gait-related parameters, and we expect this contribution to further support developments in wearable sensors, digital health, and remote monitoring for clinical trials.
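
The abstract summarises the pipeline at a high level (IC/FC event detection followed by gait parameter estimation and MAE/ICC(2,k) validation) without implementation detail. The sketch below is only an illustration of that mapping, not the authors' code: it derives step times from detected IC events and scores detected events against walkway reference events with a nearest-event MAE. All function names and the toy data are hypothetical, and agreement statistics such as ICC(2,k) would be computed separately with a standard statistics package.

"""
Illustrative sketch (not the authors' pipeline): step times from detected
initial-contact (IC) events, and event-level MAE against reference events.
Event times are assumed to be in seconds; all names are hypothetical.
"""
import numpy as np


def step_times(ic_times: np.ndarray) -> np.ndarray:
    """Step time = interval between consecutive IC events (either foot)."""
    ic = np.sort(np.asarray(ic_times, dtype=float))
    return np.diff(ic)


def event_mae_ms(detected: np.ndarray, reference: np.ndarray) -> float:
    """MAE in milliseconds, matching each reference event to its
    nearest detected event."""
    det = np.sort(np.asarray(detected, dtype=float))
    ref = np.asarray(reference, dtype=float)
    nearest = det[np.abs(det[:, None] - ref[None, :]).argmin(axis=0)]
    return float(np.mean(np.abs(nearest - ref)) * 1000.0)


if __name__ == "__main__":
    # Toy example: detected ICs jittered by ~25 ms around the reference.
    rng = np.random.default_rng(0)
    ref_ic = np.arange(0.0, 10.0, 0.55)                     # walkway reference ICs
    det_ic = ref_ic + rng.normal(0.0, 0.025, ref_ic.size)   # wearable-detected ICs
    print("step times (s):", np.round(step_times(det_ic)[:5], 3))
    print("IC MAE (ms):   ", round(event_mae_ms(det_ic, ref_ic), 1))

Matching each reference event to the nearest detected event is one common way to report event-timing error in milliseconds; the authors' exact matching rule is not stated in the abstract.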

Details

Title
An Automatic Gait Analysis Pipeline for Wearable Sensors: A Pilot Study in Parkinson’s Disease
Author
Peraza, Luis R 1; Kinnunen, Kirsi M 1; McNaney, Roisin 2; Craddock, Ian J 3; Whone, Alan L 4; Morgan, Catherine 4; Joules, Richard 1; Wolz, Robin 5

1 IXICO, London EC1A 9PN, UK; [email protected] (L.R.P.); [email protected] (K.M.K.); [email protected] (R.J.); [email protected] (R.W.)
2 Department of Human Centred Computing, Monash University, Clayton, VIC 3800, Australia; [email protected]
3 Electrical and Electronic Engineering, School of Computer Science, University of Bristol, Bristol BS8 1QU, UK; [email protected]
4 Translational Health Sciences, University of Bristol Medical School, Bristol BS8 1QU, UK; [email protected]; Movement Disorders Group, North Bristol NHS Trust, Westbury on Trym, Bristol BS10 5NB, UK
5 IXICO, London EC1A 9PN, UK; [email protected] (L.R.P.); [email protected] (K.M.K.); [email protected] (R.J.); [email protected] (R.W.); Department of Computing, Imperial College London, London SW7 2AZ, UK
First page
8286
Publication year
2021
Publication date
2021
Publisher
MDPI AG
e-ISSN
1424-8220
Source type
Scholarly Journal
Language of publication
English
ProQuest document ID
2612857262
Copyright
© 2021 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/). Notwithstanding the ProQuest Terms and Conditions, you may use this content in accordance with the terms of the License.