Please use this identifier to cite or link to this item: https://doi.org/10.1109/TITB.2011.2159122
DC Field: Value
dc.title: Ubiquitous human upper-limb motion estimation using wearable sensors
dc.contributor.author: Zhang, Z.-Q.
dc.contributor.author: Wong Sr., W.-C.
dc.contributor.author: Wu, J.-K.
dc.date.accessioned: 2014-06-17T03:09:35Z
dc.date.available: 2014-06-17T03:09:35Z
dc.date.issued: 2011-07
dc.identifier.citation: Zhang, Z.-Q., Wong Sr., W.-C., Wu, J.-K. (2011-07). Ubiquitous human upper-limb motion estimation using wearable sensors. IEEE Transactions on Information Technology in Biomedicine 15 (4): 513-521. ScholarBank@NUS Repository. https://doi.org/10.1109/TITB.2011.2159122
dc.identifier.issn: 10897771
dc.identifier.uri: http://scholarbank.nus.edu.sg/handle/10635/57736
dc.description.abstract: Human motion capture technologies are used in a broad spectrum of applications, including interactive games and learning, animation, film special effects, health care, and navigation. Existing human motion capture techniques, which use multiple structured high-resolution cameras in a dedicated studio, are complicated and expensive. With the rapid development of microsensors-on-chip, human motion capture using wearable microsensors has become an active research topic. Because of the agility of its movement, the upper limb has been regarded as the most difficult part of the body to track. In this paper, we take the upper limb as our research subject and propose a novel ubiquitous upper-limb motion estimation algorithm that concentrates on modeling the relationship between upper-arm and forearm movement. A link structure with five degrees of freedom (DOF) is proposed to model the human upper-limb skeleton. Its parameters are defined according to the Denavit-Hartenberg convention, forward kinematics equations are derived, and an unscented Kalman filter is deployed to estimate the defined parameters. Experimental results show that the proposed upper-limb motion capture and analysis algorithm outperforms other fusion methods and provides accurate results when compared against the BTS optical motion tracker. © 2011 IEEE. (An illustrative forward-kinematics sketch based on this description follows the metadata fields below.)
dc.description.uri: http://libproxy1.nus.edu.sg/login?url=http://dx.doi.org/10.1109/TITB.2011.2159122
dc.source: Scopus
dc.subject: Body sensor network
dc.subject: forward kinematics
dc.subject: Kalman filter
dc.subject: ubiquitous motion modeling and estimation
dc.type: Article
dc.contributor.department: ELECTRICAL & COMPUTER ENGINEERING
dc.description.doi: 10.1109/TITB.2011.2159122
dc.description.sourcetitle: IEEE Transactions on Information Technology in Biomedicine
dc.description.volume: 15
dc.description.issue: 4
dc.description.page: 513-521
dc.description.coden: ITIBF
dc.identifier.isiut: 000293660300003
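
The abstract above describes the method only at a high level: a 5-DOF link model of the upper limb, Denavit-Hartenberg (DH) parameters, derived forward kinematics, and an unscented Kalman filter for parameter estimation. The sketch below is a minimal, self-contained illustration of such a DH forward-kinematics chain, not the authors' implementation; the joint assignments, the example link lengths (0.30 m upper arm, 0.25 m forearm), and the names dh_transform and forward_kinematics are assumptions introduced here for illustration.

import numpy as np

def dh_transform(theta, d, a, alpha):
    # Standard Denavit-Hartenberg homogeneous transform for a single joint.
    ct, st = np.cos(theta), np.sin(theta)
    ca, sa = np.cos(alpha), np.sin(alpha)
    return np.array([
        [ct, -st * ca,  st * sa, a * ct],
        [st,  ct * ca, -ct * sa, a * st],
        [0.0,      sa,       ca,      d],
        [0.0,     0.0,      0.0,    1.0],
    ])

def forward_kinematics(joint_angles, dh_constants):
    # Chain the per-joint DH transforms to obtain the end-effector (wrist) pose
    # as a 4x4 homogeneous matrix expressed in the shoulder frame.
    T = np.eye(4)
    for theta, (d, a, alpha) in zip(joint_angles, dh_constants):
        T = T @ dh_transform(theta, d, a, alpha)
    return T

# Hypothetical 5-DOF assignment: three shoulder rotations, elbow flexion/extension,
# and forearm pronation/supination. Link lengths are illustrative values only.
dh_constants = [
    (0.00, 0.0,  np.pi / 2),   # shoulder DOF 1
    (0.00, 0.0, -np.pi / 2),   # shoulder DOF 2
    (0.30, 0.0,  np.pi / 2),   # shoulder DOF 3; d = assumed upper-arm length (m)
    (0.00, 0.0, -np.pi / 2),   # elbow flexion/extension
    (0.25, 0.0,  0.0),         # forearm rotation; d = assumed forearm length (m)
]
joint_angles = np.deg2rad([10.0, 20.0, 0.0, 45.0, 5.0])  # example joint angles
print(forward_kinematics(joint_angles, dh_constants))

In the setting the abstract describes, the five joint angles would form part of the state that the unscented Kalman filter estimates from the wearable sensors' measurements, with a forward-kinematics map of this kind linking that state to the observations; the paper itself specifies the actual state and measurement models.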
Appears in Collections: Staff Publications

Files in This Item: There are no files associated with this item.

Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.