DC Field	Value
dc.title	Human-aided robotic grasping
dc.contributor.author	Chen, N.
dc.contributor.author	Chew, C.-M.
dc.contributor.author	Tee, K.P.
dc.contributor.author	Han, B.S.
dc.identifier.citation	Chen, N., Chew, C.-M., Tee, K.P., Han, B.S. (2012). Human-aided robotic grasping. Proceedings - IEEE International Workshop on Robot and Human Interactive Communication: 75-80. ScholarBank@NUS Repository.
dc.description.abstract	To provide a user-friendly system that grasps different objects from simple operation commands, this paper describes a combined approach of real-time remote vision-based teleoperation and autonomy for human-aided robotic grasping. In the teleoperation process, a Kinect tracks the positions of the human shoulder, elbow and hand joints in real time so that the robot can imitate the human arm motion. Hand gestures are recognized and used to activate autonomous grasping, which saves time and produces more natural grasping poses. In our system, the robot carries out tasks such as picking up objects from simple commands, with the Kinect serving as the object sensor. Experimental results show that the system is effective and user-friendly. © 2012 IEEE.
dc.type	Conference Paper
dc.contributor.department	MECHANICAL ENGINEERING
dc.description.sourcetitle	Proceedings - IEEE International Workshop on Robot and Human Interactive Communication
Appears in Collections: Staff Publications
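The abstract describes imitating the human arm by tracking the 3-D positions of the shoulder, elbow and hand joints with a Kinect. As an illustration only (a minimal hypothetical sketch, not the authors' implementation), the bend angle at a tracked joint can be recovered from three such positions with a dot product:

```python
import math

def angle_at(joint, a, c):
    """Angle (radians) at `joint` between segments joint->a and joint->c.

    All arguments are (x, y, z) positions, e.g. from a skeleton tracker.
    """
    v1 = [a[i] - joint[i] for i in range(3)]
    v2 = [c[i] - joint[i] for i in range(3)]
    dot = sum(x * y for x, y in zip(v1, v2))
    n1 = math.sqrt(sum(x * x for x in v1))
    n2 = math.sqrt(sum(x * x for x in v2))
    # Clamp to [-1, 1] to guard against floating-point drift before acos.
    return math.acos(max(-1.0, min(1.0, dot / (n1 * n2))))

# Hypothetical tracked positions (metres): elbow bent at a right angle.
shoulder = (0.0, 0.0, 0.0)
elbow = (0.3, 0.0, 0.0)
hand = (0.3, -0.25, 0.0)
print(math.degrees(angle_at(elbow, shoulder, hand)))  # 90.0
```

Angles like this, computed per frame for the shoulder and elbow, could then be sent as joint commands so the robot arm mirrors the operator's pose; the actual kinematic mapping in the paper is not detailed in this record.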

Files in This Item:
There are no files associated with this item.

Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.