Please use this identifier to cite or link to this item: https://doi.org/10.1016/j.jbiomech.2005.08.014
Title: Classification of gait patterns in the time-frequency domain
Authors: Nyan, M.N. 
Tay, F.E.H. 
Seah, K.H.W. 
Sitoh, Y.Y.
Keywords: Daily activities
Direct spatial correlation
Elderly
Fall
Gait patterns
Segment extraction
Issue Date: 2006
Citation: Nyan, M.N., Tay, F.E.H., Seah, K.H.W., Sitoh, Y.Y. (2006). Classification of gait patterns in the time-frequency domain. Journal of Biomechanics 39 (14) : 2647-2656. ScholarBank@NUS Repository. https://doi.org/10.1016/j.jbiomech.2005.08.014
Abstract: This paper describes the classification of gait patterns among descending stairs, ascending stairs and level walking activities using accelerometers arranged in the antero-posterior and vertical directions on the shoulder of a garment. Gait patterns in continuous accelerometer records were classified in two steps. In the first step, direct spatial correlation of discrete dyadic wavelet coefficients was applied to separate the segments of gait patterns in the continuous accelerometer record. Compared to the reference system, averaged absolute errors of 0.387 s for ascending stairs and 0.404 s for descending stairs were achieved. The overall sensitivity and specificity of ascending stairs were 98.79% and 99.52%, respectively, and those of descending stairs were 97.35% and 99.62%. In the second step, powers of wavelet coefficients over 2 s windows from the separated segments of the vertical and antero-posterior acceleration signals were used as features for classification. Our results demonstrate a reliable technique for classifying gait patterns during physical activity. © 2005 Elsevier Ltd. All rights reserved.
Source Title: Journal of Biomechanics
URI: http://scholarbank.nus.edu.sg/handle/10635/59711
ISSN: 0021-9290
DOI: 10.1016/j.jbiomech.2005.08.014
Appears in Collections: Staff Publications
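
The abstract describes a two-step approach: (1) direct spatial correlation of discrete dyadic wavelet coefficients to separate gait segments in the continuous accelerometer record, and (2) powers of wavelet coefficients over 2 s windows as classification features. The following is a minimal illustrative sketch of that idea, not the authors' implementation; the sampling rate (fs = 50 Hz), wavelet choice ('db4'), decomposition level and threshold are assumptions made for illustration only.

import numpy as np
import pywt

def spatial_correlation_boundaries(signal, wavelet="db4", level=3, fs=50):
    """Multiply detail coefficients at adjacent dyadic scales; large values in
    the product highlight transitions between gait segments (a simple form of
    direct spatial correlation). Sketch only; parameters are assumed."""
    n = len(signal)
    pad = (-n) % (2 ** level)                   # pywt.swt needs len % 2**level == 0
    x = np.pad(signal, (0, pad), mode="edge")
    coeffs = pywt.swt(x, wavelet, level=level)  # [(cA_L, cD_L), ..., (cA_1, cD_1)]
    details = [cD for _, cD in coeffs]
    corr = np.abs(details[0] * details[1])[:n]  # correlate two adjacent scales
    threshold = corr.mean() + 2 * corr.std()    # illustrative threshold only
    boundaries = np.flatnonzero(corr > threshold)
    return boundaries / fs                      # candidate boundary times (s)

def wavelet_power_features(segment, wavelet="db4", level=3, fs=50, win_s=2.0):
    """Power of wavelet detail coefficients over 2 s windows of a separated
    segment, yielding one feature vector per window for a classifier."""
    win = int(win_s * fs)
    feats = []
    for start in range(0, len(segment) - win + 1, win):
        window = segment[start:start + win]
        cs = pywt.wavedec(window, wavelet, level=level)   # [cA_L, cD_L, ..., cD_1]
        feats.append([np.sum(c ** 2) for c in cs[1:]])    # detail-band powers
    return np.array(feats)

In this sketch the boundary times from the vertical and antero-posterior channels would be used to cut the continuous record into activity segments, and the per-window power features from both channels would then be fed to a classifier to distinguish stair ascent, stair descent and level walking.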
