Title: Vision-based multi-agent cooperative target search
Source: Hu, J., Xie, L., Xu, J. (2012). Vision-based multi-agent cooperative target search. 2012 12th International Conference on Control, Automation, Robotics and Vision, ICARCV 2012: 895-900. ScholarBank@NUS Repository. https://doi.org/10.1109/ICARCV.2012.6485276
Abstract: This paper addresses vision-based cooperative search for multiple mobile ground targets by a group of unmanned aerial vehicles (UAVs) with limited sensing and communication capabilities. The airborne camera on each UAV has a limited field of view, and its target discriminability varies as a function of altitude. First, a general target detection probability model is built from the physical imaging process of the camera. By dividing the surveillance region into cells, each UAV maintains a probability map indicating the probability of target existence within each cell. We then propose a distributed probability-map updating model that includes the fusion of measurement information, information sharing among neighboring agents, and information decay and transfer due to environmental changes such as target movement. Furthermore, we formulate the multi-agent target search problem as a cooperative coverage control problem that optimizes the collective coverage area and the detection performance. The proposed map updating model and cooperative control scheme are distributed, i.e., each agent communicates only with neighbors within its communication range. Finally, the effectiveness of the proposed algorithms is illustrated by simulation. © 2012 IEEE.
Source Title: 2012 12th International Conference on Control, Automation, Robotics and Vision, ICARCV 2012
Appears in Collections: Staff Publications
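The abstract describes a per-cell probability map that each UAV maintains and updates through three mechanisms: Bayesian fusion of camera measurements, averaging with neighboring agents' maps, and decay to account for target movement. The paper itself does not give an implementation; the following is a minimal sketch of such an update step under assumed parameter names (`pd` for the altitude-dependent detection probability, `pf` for the false-alarm probability, and the `mix` and `decay` weights are illustrative, not from the paper).

```python
import numpy as np

def bayes_update(p, detected, pd, pf):
    """Bayesian fusion of one camera measurement into a cell's
    target-existence probability. pd and pf are hypothetical names for
    the detection and false-alarm probabilities."""
    if detected:
        num = pd * p
        den = pd * p + pf * (1.0 - p)
    else:
        num = (1.0 - pd) * p
        den = (1.0 - pd) * p + (1.0 - pf) * (1.0 - p)
    return num / den

def update_map(p_map, obs, pd, pf, neighbor_maps, mix=0.3, decay=0.05):
    """One distributed update step for a UAV's probability map:
    (1) fuse measurements in the observed cells,
    (2) average with neighbors' maps (information sharing),
    (3) decay toward the uninformative prior 0.5 to model
        environmental change such as target movement."""
    p = p_map.copy()
    for (i, j), detected in obs.items():
        p[i, j] = bayes_update(p[i, j], detected, pd, pf)
    if neighbor_maps:  # only neighbors within communication range
        shared = np.mean(neighbor_maps, axis=0)
        p = (1.0 - mix) * p + mix * shared
    return (1.0 - decay) * p + decay * 0.5

# Example: a 4x4 map starting at the uninformative prior; one detection
# in cell (1, 1) raises that cell's probability, the rest stay near 0.5.
p0 = np.full((4, 4), 0.5)
p1 = update_map(p0, {(1, 1): True}, pd=0.9, pf=0.1, neighbor_maps=[])
```

The decay step keeps every cell's probability from freezing at 0 or 1, so cells that have not been revisited gradually become uncertain again, which is one way to realize the "information decay" the abstract mentions.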
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.