Title: Boundary following and globally convergent path planning using instant goals
Authors: Ge, S.S.; Lai, X.; Al Mamun, A.
Keywords: Sensor-based path planning
Issue Date: Apr-2005
Citation: Ge, S.S., Lai, X., Al Mamun, A. (2005-04). Boundary following and globally convergent path planning using instant goals. IEEE Transactions on Systems, Man, and Cybernetics, Part B: Cybernetics 35 (2): 240-254. ScholarBank@NUS Repository. https://doi.org/10.1109/TSMCB.2004.842368
Abstract: In this paper, an Instant Goal approach is proposed for collision-free boundary following of obstacles of arbitrary shape and globally convergent path planning in unknown environments. Firstly, for effective knowledge representation and manipulation, a vector representation is presented, which not only saves much space but also conforms to the physical properties of range sensors. Secondly, the concept of Instant Goals is introduced enabling the robot to perform boundary following in a "natural" human-like manner, with additional measures taken to ensure that the robot is moving "forward" along the boundary, even if the obstacle is of arbitrary shape and disturbing obstacles are present. Collision checking is performed simultaneously and, when needed, collision avoidance is efficiently incorporated. Based on the approach of boundary following, a realistic sensor-based path planner with global convergence property is designed for the robot capable of acquiring discrete and noisy range data. Realistic simulation experiments validate the effectiveness of the proposed approaches. © 2005 IEEE.
Source Title: IEEE Transactions on Systems, Man, and Cybernetics, Part B: Cybernetics
URI: http://scholarbank.nus.edu.sg/handle/10635/55225
ISSN: 1083-4419
DOI: 10.1109/TSMCB.2004.842368
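The abstract's instant-goal idea for boundary following can be sketched in a few lines: from a discrete range scan, take the closest boundary point and place a temporary goal a fixed clearance away from the boundary, rotated "forward" along it. This is a minimal illustrative sketch under stated assumptions, not the paper's algorithm; the function name `instant_goal` and the `clearance` and `forward_angle` parameters are inventions for this example.

```python
import math

def instant_goal(robot_pos, scan, clearance=0.5, forward_angle=math.pi / 3):
    """Pick an 'instant goal' from a range scan (illustrative sketch only).

    robot_pos: (x, y) of the robot in the world frame.
    scan: list of (bearing_rad, range_m) readings in the robot's frame,
          mirroring the vector representation of range data.
    """
    # Closest obstacle reading defines the boundary point being followed.
    bearing, dist = min(scan, key=lambda s: s[1])
    # Unit vector from the robot toward that closest boundary point.
    ux, uy = math.cos(bearing), math.sin(bearing)
    # Rotate the direction by forward_angle so the goal leads the robot
    # "forward" along the boundary instead of straight into it.
    fx = ux * math.cos(forward_angle) - uy * math.sin(forward_angle)
    fy = ux * math.sin(forward_angle) + uy * math.cos(forward_angle)
    # Shorten the step so the robot keeps roughly `clearance` from the wall.
    r = max(dist - clearance, 0.0)
    return robot_pos[0] + r * fx, robot_pos[1] + r * fy
```

For example, with a wall one metre to the robot's right and the default parameters, the returned goal lies ahead of and slightly toward the wall, so repeatedly moving to successive instant goals traces the boundary at a safe standoff.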
Appears in Collections: Staff Publications
Files in This Item:
There are no files associated with this item.
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.