Please use this identifier to cite or link to this item: https://doi.org/10.1109/ICIP.2014.7025309
Title: An efficient method for human pointing estimation for robot interaction
Authors: Ueno S.
Naito S.
Chen T. 
Keywords: Calibration
Object Identification
Pointing Gesture
Robot Interaction
Issue Date: 2014
Publisher: Institute of Electrical and Electronics Engineers Inc.
Citation: Ueno S., Naito S., Chen T. (2014). An efficient method for human pointing estimation for robot interaction. 2014 IEEE International Conference on Image Processing, ICIP 2014: 1545-1549. ScholarBank@NUS Repository. https://doi.org/10.1109/ICIP.2014.7025309
Abstract: In this paper, we propose an efficient calibration method to estimate the pointing direction of a human pointing gesture in order to facilitate robot interaction. The way people use pointing gestures to indicate an object varies from individual to individual. In addition, people do not always point at the object carefully, so there is a divergence between the line from the eye to the tip of the index finger and the line of sight. Hence, we focus on adapting to these individual pointing styles to improve the accuracy of target object identification by means of an effective calibration process. We model each individual's style as two offsets, a horizontal offset and a vertical offset. After locating the head and fingertip positions, we learn these offsets for each person through a training process in which the person points at the camera. Experimental results show that our proposed method outperforms conventional head-hand, head-fingertip, and eye-fingertip based pointing recognition methods.
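
The following is a minimal sketch of the calibration idea summarized in the abstract, not the authors' implementation: the coordinate conventions, function names, and the assumption that 3D head and fingertip positions are available from a depth sensor in camera coordinates are all illustrative.

# Minimal sketch of offset-based pointing calibration (illustrative only).
import numpy as np

def ray_angles(head, fingertip):
    # Horizontal (azimuth) and vertical (elevation) angles of the
    # head-to-fingertip ray, in camera coordinates (x right, y down, z forward).
    d = np.asarray(fingertip, float) - np.asarray(head, float)
    azimuth = np.arctan2(d[0], d[2])
    elevation = np.arctan2(-d[1], np.hypot(d[0], d[2]))
    return azimuth, elevation

def learn_offsets(calibration_frames):
    # Each frame is a (head, fingertip) pair captured while the user points at
    # the camera, so the true direction is from the head to the camera origin.
    errors = []
    for head, fingertip in calibration_frames:
        az, el = ray_angles(head, fingertip)
        true_az, true_el = ray_angles(head, (0.0, 0.0, 0.0))
        errors.append((true_az - az, true_el - el))
    return np.mean(errors, axis=0)   # (horizontal offset, vertical offset)

def corrected_direction(head, fingertip, offsets):
    # Apply the learned per-user offsets to a new gesture and return a unit ray.
    az, el = ray_angles(head, fingertip)
    az, el = az + offsets[0], el + offsets[1]
    return np.array([np.cos(el) * np.sin(az), -np.sin(el), np.cos(el) * np.cos(az)])

In a system built on this idea, target object identification would then amount to intersecting the corrected ray with the candidate object positions and selecting the closest match.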
Source Title: 2014 IEEE International Conference on Image Processing, ICIP 2014
URI: http://scholarbank.nus.edu.sg/handle/10635/146082
ISBN: 9781479957514
DOI: 10.1109/ICIP.2014.7025309
Appears in Collections: Staff Publications

Files in This Item:
There are no files associated with this item.