Please use this identifier to cite or link to this item:
Title: ANN based internal model approach to motor learning for humanoid robot
Authors: Xu, J.-X.; Wang, W.; Vadakkepat, P.; Yee, L.W.
Keywords: Multiple internal model; Spatial and temporal scalabilities
Issue Date: 2006
Citation: Xu, J.-X., Wang, W., Vadakkepat, P., Yee, L.W. (2006). ANN based internal model approach to motor learning for humanoid robot. IEEE International Conference on Neural Networks - Conference Proceedings: 4179-4186. ScholarBank@NUS Repository.
Abstract: In this paper, we present an approach to motor skill learning based on internal models. By exploiting the temporal and spatial scalability of internal models, we first investigate the possibility of generating similar movement patterns directly from the same internal model with minimal changes to its parameters, thereby avoiding reinforcement learning. Next, we consider more complex movements for which different internal models are needed. Based on task decomposition, all movements can be classified into sequential and parallel DMPs. The former requires a number of IMs to work sequentially so that a sophisticated motor behavior can be performed; the latter requires a number of IMs to work in parallel to generate the needed movement patterns. To mimic human limb behavior, a two-link robot arm is used as the first prototype to perform the motor learning process of letter writing. A FUJITSU HOAP-1 humanoid robot is used as the second prototype, with the upper-limb movement conducted in real time, which further validates the effectiveness of the multiple internal model approach to motor learning. © 2006 IEEE.
Source Title: IEEE International Conference on Neural Networks - Conference Proceedings
URI: http://scholarbank.nus.edu.sg/handle/10635/69416
ISBN: 0780394909
ISSN: 10987576
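The abstract distinguishes two ways of combining internal models (IMs): sequential composition, where IMs produce motor commands for consecutive trajectory segments (e.g. the strokes of a letter), and parallel composition, where several IMs contribute simultaneously to one movement. A minimal illustrative sketch of these two composition schemes is below; it is not the authors' implementation, and the stand-in IMs (simple linear maps via `make_im`) and the function names are hypothetical placeholders for trained ANN models.

```python
def make_im(gain):
    """Toy stand-in for a learned internal model: maps a desired
    trajectory (list of positions) to motor commands. A real IM in
    the paper's setting would be a trained ANN, not a linear gain."""
    return lambda desired: [gain * p for p in desired]

def sequential(ims, segments):
    """Sequential composition: one IM per trajectory segment,
    commands concatenated in time (e.g. successive letter strokes)."""
    commands = []
    for im, seg in zip(ims, segments):
        commands.extend(im(seg))
    return commands

def parallel(ims, desired):
    """Parallel composition: several IMs act on the same desired
    trajectory and their command contributions are summed
    (e.g. shoulder and elbow contributions to one arm movement)."""
    outputs = [im(desired) for im in ims]
    return [sum(step) for step in zip(*outputs)]

# Spatial scalability, hypothetically: reuse the *same* IM on a
# uniformly scaled trajectory instead of learning a new one.
def spatially_scaled(im, scale):
    return lambda desired: im([scale * p for p in desired])
```

Usage sketch: `sequential([make_im(2), make_im(3)], [[1, 2], [3]])` chains two IMs over two segments, while `parallel([make_im(1), make_im(0.5)], [2, 4])` sums two IMs' commands per time step.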
|Appears in Collections:||Staff Publications|