Please use this identifier to cite or link to this item: https://doi.org/10.1109/TIE.2005.847577
Title: Task-oriented developmental learning for humanoid robots
Authors: Tan, K.C. 
Chen, Y.J.
Tan, K.K. 
Lee, T.H. 
Keywords: Humanoid robots
Learning systems
Task representation
Issue Date: Jun-2005
Citation: Tan, K.C., Chen, Y.J., Tan, K.K., Lee, T.H. (2005-06). Task-oriented developmental learning for humanoid robots. IEEE Transactions on Industrial Electronics 52 (3) : 906-914. ScholarBank@NUS Repository. https://doi.org/10.1109/TIE.2005.847577
Abstract: This paper presents a new approach to task-oriented developmental learning for humanoid robots. It is capable of setting up multiple task representations automatically from real-time experiences, thereby enabling a robot to handle various tasks concurrently without the need to predefine them. In the approach, an evolvable partitioned tree structure serves as the task-representation knowledgebase, which is divided into different task domains. Search and update of task knowledge are focused on a particular task branch rather than the whole knowledgebase, which is often large and time-consuming to traverse. A prototype of the proposed task-oriented developmental learning is designed and implemented on a Khepera robot. Experimental results show that the robot can redirect itself to new tasks through interactions with the environment, and that a learned task can be easily updated to meet varying specifications in the real world. © 2005 IEEE.
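The abstract's central idea, a knowledgebase partitioned into task-domain branches so that search and update touch only one branch, can be illustrated with a minimal sketch. The paper does not publish code; the class and method names below (TaskTree, update, search) and the string-valued states and actions are hypothetical, chosen only to show branch-focused lookup.

```python
# Hypothetical sketch of a partitioned task-knowledge tree: the root
# partitions knowledge into task domains, and search/update scan only
# the branch for the queried domain, not the whole knowledgebase.

class TaskNode:
    """One piece of task knowledge: a sensed context and its learned response."""
    def __init__(self, state, action):
        self.state = state
        self.action = action

class TaskTree:
    def __init__(self):
        # domain name -> list of TaskNode forming that domain's branch
        self.domains = {}

    def update(self, domain, state, action):
        """Insert or revise knowledge inside a single domain branch."""
        branch = self.domains.setdefault(domain, [])
        for node in branch:
            if node.state == state:
                node.action = action   # revise existing knowledge in place
                return
        branch.append(TaskNode(state, action))  # grow the branch

    def search(self, domain, state):
        """Look up an action, scanning only the matching branch."""
        for node in self.domains.get(domain, []):
            if node.state == state:
                return node.action
        return None   # unknown situation: a cue for further learning

tree = TaskTree()
tree.update("obstacle-avoidance", "wall-ahead", "turn-left")
tree.update("wall-following", "wall-on-right", "go-straight")
print(tree.search("obstacle-avoidance", "wall-ahead"))  # -> turn-left
```

The point of the partitioning is cost: a query for an obstacle-avoidance situation never examines the wall-following branch, so lookup time scales with one branch rather than the full knowledgebase.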
Source Title: IEEE Transactions on Industrial Electronics
URI: http://scholarbank.nus.edu.sg/handle/10635/57594
ISSN: 0278-0046
DOI: 10.1109/TIE.2005.847577
Appears in Collections:Staff Publications


SCOPUS Citations: 11 (checked on Sep 25, 2018)
Web of Science Citations: 8 (checked on Sep 17, 2018)
Page view(s): 33 (checked on Sep 22, 2018)


Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.