|Title:||Motion intent recognition for control of a lower extremity assistive device (LEAD)|
|Keywords:||lower extremities rehabilitation|
|Source:||Shen, B., Li, J., Bai, F., Chew, C.-M. (2013). Motion intent recognition for control of a lower extremity assistive device (LEAD). 2013 IEEE International Conference on Mechatronics and Automation, IEEE ICMA 2013: 926-931. ScholarBank@NUS Repository. https://doi.org/10.1109/ICMA.2013.6618039|
|Abstract:||This paper presents a motion intent recognition method to control a wearable lower extremity assistive device (LEAD) intended to aid stroke patients during activities of daily living (ADL) or rehabilitation. The main goal is to identify the user's intended motion from sensor readings on the limb attached to the assistive device, so that the right control actions can be executed to aid the user in the intended action effectively. A database of a healthy subject performing various motion tasks is collected. Features of the signals are then extracted, and Principal Component Analysis (PCA) is performed to reduce the number of dimensions. Using the transformed signals, a multi-class Support Vector Machine (SVM) with a Radial Basis Function (RBF) kernel is trained to classify the different motion patterns. A Nelder-Mead optimization algorithm is used to select the appropriate parameters for each SVM. Test results show that the SVM can correctly classify each motion pattern with an average accuracy of 95.8±4.1%. An offline classification result of a healthy subject performing a series of motion tasks while wearing the LEAD shows that the proposed method can effectively recognize the different motion intents of the user. © 2013 IEEE.|
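The pipeline described in the abstract (dimensionality reduction with PCA, then a multi-class RBF-kernel SVM whose parameters are chosen by Nelder-Mead optimization) can be sketched as follows. This is a minimal illustration, not the authors' implementation: the synthetic data stands in for the paper's sensor-derived motion features, and the specific feature counts, class count, and starting parameters are assumptions for the example.

```python
# Sketch of the abstract's pipeline: PCA for dimensionality reduction,
# then a multi-class RBF-kernel SVM with (C, gamma) tuned by Nelder-Mead.
# Synthetic data stands in for the paper's motion-feature database.
import numpy as np
from scipy.optimize import minimize
from sklearn.datasets import make_classification
from sklearn.decomposition import PCA
from sklearn.model_selection import cross_val_score, train_test_split
from sklearn.svm import SVC

# Synthetic stand-in: 4 motion classes, 20 raw features per sample.
X, y = make_classification(n_samples=400, n_features=20, n_informative=8,
                           n_classes=4, n_clusters_per_class=1,
                           random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# PCA reduces the feature dimensionality (5 components is an assumption).
pca = PCA(n_components=5).fit(X_train)
Z_train, Z_test = pca.transform(X_train), pca.transform(X_test)

def objective(log_params):
    """Negative cross-validated accuracy as a function of log(C, gamma);
    log-scaling keeps both parameters positive during the search."""
    C, gamma = np.exp(log_params)
    svm = SVC(kernel="rbf", C=C, gamma=gamma)
    return -cross_val_score(svm, Z_train, y_train, cv=3).mean()

# Derivative-free Nelder-Mead search over the SVM hyperparameters.
result = minimize(objective, x0=np.log([1.0, 0.1]),
                  method="Nelder-Mead", options={"maxiter": 40})
C_best, gamma_best = np.exp(result.x)

# Train the final classifier with the selected parameters and evaluate.
clf = SVC(kernel="rbf", C=C_best, gamma=gamma_best).fit(Z_train, y_train)
accuracy = clf.score(Z_test, y_test)
print(f"test accuracy: {accuracy:.3f}")
```

Nelder-Mead is a natural fit here because cross-validated accuracy is a noisy, non-differentiable function of (C, gamma), so a derivative-free simplex search can be applied directly.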
|Source Title:||2013 IEEE International Conference on Mechatronics and Automation, IEEE ICMA 2013|
|Appears in Collections:||Staff Publications|
Files in This Item:
There are no files associated with this item.