Title: Landmark extraction and state estimation for UAV operation in forest
Authors: Cui, J.
Wang, F.
Dong, X.
Ang Zong Yao, K.
Chen, B.M.
Lee, T.H. 
Keywords: Feature Extraction
Scan Matching
Scan Segmentation
State Estimation
Issue Date: 18-Oct-2013
Citation: Cui, J., Wang, F., Dong, X., Ang Zong Yao, K., Chen, B.M., Lee, T.H. (2013-10-18). Landmark extraction and state estimation for UAV operation in forest. Chinese Control Conference, CCC: 5210-5215. ScholarBank@NUS Repository.
Abstract: In this paper, we present the essential problems and solutions concerning feature extraction and state estimation for UAV operation in a GPS-denied environment, namely a forest. Tree trunks, selected as the features present in the operational environment, help the UAV localize itself in an unknown forest. The UAV is equipped with an inertial measurement unit (IMU) and a scanning laser rangefinder (LRF). The raw laser data are preprocessed to remove outliers. A clustering algorithm based on the spatial discontinuity of consecutive laser beams is applied to the preprocessed laser scan data. The clustered segments are characterized by a group of geometric descriptors, which are subjected to a series of thresholds to remove false tree stems such as possible ground hits and bushes. To ensure fast and correct data association, the extracted features in consecutive scans are first aligned in rotation using the yaw angle measurement of the IMU. A scan matching algorithm is then applied to estimate the incremental rotation and translation of the UAV, which are fed into an IMU-driven Kalman filter to estimate the UAV position and velocity. © 2013 TCCT, CAA.
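The abstract's scan-segmentation step — splitting a laser scan into clusters wherever consecutive beam endpoints are spatially discontinuous — can be sketched as follows. This is a minimal illustration, not the authors' implementation: the function name `cluster_scan` and the jump threshold value are assumptions for the example.

```python
import math

def cluster_scan(ranges, angle_increment, jump_threshold=0.5):
    """Group consecutive laser returns into segments, starting a new
    segment wherever the Euclidean gap between neighboring beam
    endpoints exceeds jump_threshold (meters, an assumed value).

    ranges: range readings (meters), one per beam.
    angle_increment: angular step between beams (radians).
    """
    if not ranges:
        return []

    # Convert polar readings (range, bearing) to Cartesian endpoints.
    points = [(r * math.cos(i * angle_increment),
               r * math.sin(i * angle_increment))
              for i, r in enumerate(ranges)]

    segments = [[points[0]]]
    for prev, cur in zip(points, points[1:]):
        if math.dist(prev, cur) > jump_threshold:
            segments.append([cur])      # spatial discontinuity: new cluster
        else:
            segments[-1].append(cur)    # same object: extend current cluster
    return segments
```

Each resulting segment would then be scored by geometric descriptors (e.g., apparent width consistent with a trunk diameter) and thresholded to reject ground hits and bushes, as the abstract describes.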
Source Title: Chinese Control Conference, CCC
ISBN: 9789881563835
ISSN: 1934-1768
Appears in Collections: Staff Publications
