Title: Ambient noise imaging in warm shallow waters: Robust statistical algorithms and range estimation
Authors: Chitre, M. 
Kuselan, S. 
Pallayil, V. 
Issue Date: Aug-2012
Citation: Chitre, M., Kuselan, S., Pallayil, V. (2012-08). Ambient noise imaging in warm shallow waters: Robust statistical algorithms and range estimation. Journal of the Acoustical Society of America 132 (2) : 838-847. ScholarBank@NUS Repository.
Abstract: The high frequency ambient noise in warm shallow waters is dominated by snapping shrimp. The loud snapping noises they produce are impulsive and broadband. As the noise propagates through the water, it interacts with the seabed, sea surface, and submerged objects. An array of acoustic pressure sensors can produce images of the submerged objects using this noise as the source of acoustic illumination. This concept is called ambient noise imaging (ANI) and was demonstrated using ADONIS, an ANI camera developed at the Scripps Institution of Oceanography. To overcome some of the limitations of ADONIS, a second generation ANI camera (ROMANIS) was developed at the National University of Singapore. The acoustic time series recordings made by ROMANIS during field experiments in Singapore show that the ambient noise is well modeled by a symmetric α-stable (SαS) distribution. As high-order moments of SαS distributions generally do not converge, ANI algorithms based on low-order moments and fractiles are developed and demonstrated. By localizing nearby snaps and identifying the echoes from an object, the range to the object can be passively estimated. This technique is also demonstrated using the data collected with ROMANIS. © 2012 Acoustical Society of America.
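The abstract notes that high-order moments of SαS distributions generally do not converge, which is why the paper builds ANI algorithms on low-order moments and fractiles. A minimal sketch of that point, not of the paper's actual algorithms: the snippet below draws symmetric α-stable samples with the standard Chambers-Mallows-Stuck generator (the parameter choice α = 1.5 and the moment order p = 0.5 are illustrative assumptions) and compares a fractional low-order moment estimate, which is well behaved for p < α, with the sample variance, whose theoretical counterpart is infinite for α < 2.

```python
import numpy as np

def sas_samples(alpha, n, rng):
    """Draw n standard symmetric alpha-stable samples (beta = 0, alpha != 1)
    using the Chambers-Mallows-Stuck method."""
    u = rng.uniform(-np.pi / 2, np.pi / 2, n)   # uniform angle
    w = rng.exponential(1.0, n)                 # unit-mean exponential
    return (np.sin(alpha * u) / np.cos(u) ** (1.0 / alpha)
            * (np.cos(u - alpha * u) / w) ** ((1.0 - alpha) / alpha))

rng = np.random.default_rng(0)
x = sas_samples(1.5, 100_000, rng)  # alpha = 1.5: impulsive, heavy-tailed

# Fractional low-order moment E|X|^p with p = 0.5 < alpha: finite, so its
# sample estimate is stable. The sample variance corresponds to p = 2 > alpha,
# where the theoretical moment is infinite, so the estimate never settles.
flom = np.mean(np.abs(x) ** 0.5)
print(f"p = 0.5 sample moment: {flom:.3f}")
print(f"sample variance:       {np.var(x):.1f}")
```

The same reasoning motivates fractile (quantile) based statistics: like fractional low-order moments, quantiles of heavy-tailed data remain well defined and robust even when the variance does not exist.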
Source Title: Journal of the Acoustical Society of America
ISSN: 0001-4966
DOI: 10.1121/1.4733553
Appears in Collections: Staff Publications

