Please use this identifier to cite or link to this item: https://doi.org/10.3389/fnins.2015.00522
Title: Neuromorphic event-based 3D pose estimation
Authors: Valeiras, D.R.
Orchard, G.
Ieng, S.-H.
Benosman, R.B.
Keywords: accuracy
algorithm
Article
computer program
event based three dimensional pose estimation algorithm
image analysis
image processing
image quality
mathematical analysis
mathematical model
methodology
process design
three dimensional imaging
velocity
Issue Date: 2016
Citation: Valeiras, D.R, Orchard, G, Ieng, S.-H, Benosman, R.B (2016). Neuromorphic event-based 3D pose estimation. Frontiers in Neuroscience 9 (JAN) : 522. ScholarBank@NUS Repository. https://doi.org/10.3389/fnins.2015.00522
Rights: Attribution 4.0 International
Abstract: Pose estimation is a fundamental step in many artificial vision tasks. It consists of estimating the 3D pose of an object with respect to a camera from the object's 2D projection. Current state-of-the-art implementations operate on images and are computationally expensive, especially for real-time applications. Scenes with fast dynamics exceeding 30-60 Hz can rarely be processed in real-time using conventional hardware. This paper presents a new method for event-based 3D object pose estimation, making full use of the high temporal resolution (1 μs) of asynchronous visual events output from a single neuromorphic camera. Given an initial estimate of the pose, each incoming event is used to update the pose by combining both 3D and 2D criteria. We show that the asynchronous high temporal resolution of the neuromorphic camera allows us to solve the problem in an incremental manner, achieving real-time performance at an update rate of several hundred kHz on a conventional laptop. We show that the high temporal resolution of neuromorphic cameras is a key feature for performing accurate pose estimation. Experiments are provided showing the performance of the algorithm on real data, including fast moving objects, occlusions, and cases where the neuromorphic camera and the object are both in motion. © 2016 Reverter Valeiras, Orchard, Ieng and Benosman.
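The abstract's core idea, updating the pose incrementally with each incoming event rather than per frame, can be illustrated with a minimal sketch. This is not the authors' algorithm (which combines 3D and 2D criteria and estimates a full 6-DOF pose): the pinhole model, the translation-only parameterization, the nearest-point matching, the gradient-descent step, and all function names below are illustrative assumptions.

```python
import numpy as np

def project(points_3d, pose_t, f=1.0):
    """Pinhole projection of 3D model points after applying translation pose_t."""
    p = points_3d + pose_t                       # translate model into camera frame
    return f * p[:, :2] / p[:, 2:3]              # perspective divide -> 2D points

def update_pose(pose_t, event_xy, model, eta=0.05, f=1.0):
    """One event-driven update (sketch): nudge the translation so the nearest
    projected model point moves toward the event location."""
    proj = project(model, pose_t, f)
    i = np.argmin(np.linalg.norm(proj - event_xy, axis=1))  # match event to model
    err = event_xy - proj[i]                     # 2D reprojection error
    z = model[i, 2] + pose_t[2]                  # depth of matched point
    # Gradient-descent step on x/y translation (depth held fixed for brevity):
    # d(u)/d(tx) = f/z, so step the translation by eta * err * z/f ... inverted
    # here as eta * err * f/z acting through the projection.
    step = np.array([err[0] * f / z, err[1] * f / z, 0.0])
    return pose_t + eta * step
```

Feeding a stream of events at microsecond resolution through `update_pose` keeps the estimate continuously refreshed between what would otherwise be frame boundaries, which is the incremental behavior the abstract describes.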
Source Title: Frontiers in Neuroscience
URI: https://scholarbank.nus.edu.sg/handle/10635/183726
ISSN: 1662-4548
DOI: 10.3389/fnins.2015.00522
Appears in Collections:Elements
Staff Publications

Files in This Item:
10_3389_fnins_2015_00522.pdf (3.22 MB, Adobe PDF, Open Access)
This item is licensed under a Creative Commons Attribution 4.0 International License.