Title: Acoustic diagnosis of aortic stenosis
Authors: Sun, Z.
Poh, K.-K.
Ling, L.-H.
Hong, G.S. 
Chew, C.-H. 
Issue Date: 2005
Citation: Sun, Z., Poh, K.-K., Ling, L.-H., Hong, G.S., Chew, C.-H. (2005). Acoustic diagnosis of aortic stenosis. Journal of Heart Valve Disease 14 (2): 186-194. ScholarBank@NUS Repository.
Abstract: Background and aim of the study: Phonocardiography is a promising non-invasive diagnostic tool for the assessment of aortic stenosis (AS), and time-frequency representation is a potential tool to extract information from the phonocardiogram (PCG) signal. The study aim was to develop an acoustical method to predict the severity of AS. Methods: The normalized continuous wavelet transform (NCWT) and the fast Fourier transform (FFT) were used to perform a spectral analysis of the PCG signal. A multi-peak detection algorithm was developed to determine the dominant frequency (DF) of systolic murmurs (SM). The spectral ratio of the SM, the integration of the NCWT of the SM (SI), and combined information from the SM and the second heart sound were also calculated. Results: The DF correlated best with the hemodynamic data: r = -0.72 with aortic valve (AV) area; r = 0.63 with maximal blood velocity through the AV; and r = 0.57 with mean pressure gradient across the AV. Based on DF and SI data, the study subjects (n = 59) were classified into three categories: severe AS; moderate AS; and other cases. The acoustical and echo classifications were in agreement in 50 subjects (85%). Conclusion: The acoustical method developed cannot accurately predict the severity of AS, but it is valuable as a screening classification before an invasive method is used. © Copyright by ICR Publishers 2005.
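The core step the abstract describes, estimating the dominant frequency (DF) of a systolic-murmur segment from its spectrum via multi-peak detection, can be sketched as follows. This is an illustrative reconstruction, not the authors' implementation: the function name, sampling rate, windowing choice, and the synthetic test signal are all assumptions.

```python
# Hedged sketch of dominant-frequency estimation for a PCG segment:
# compute the FFT magnitude spectrum, detect spectral peaks, and
# report the frequency of the tallest peak. All names and parameters
# here are illustrative, not taken from the paper.
import numpy as np
from scipy.signal import find_peaks

def dominant_frequency(segment, fs):
    """Return the frequency (Hz) of the largest spectral peak."""
    n = len(segment)
    # Hann window to reduce spectral leakage before the FFT.
    spectrum = np.abs(np.fft.rfft(segment * np.hanning(n)))
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    # Multi-peak detection: locate local maxima, then pick the tallest.
    peaks, _ = find_peaks(spectrum)
    if len(peaks) == 0:
        return freqs[np.argmax(spectrum)]
    return freqs[peaks[np.argmax(spectrum[peaks])]]

# Synthetic "murmur": a 220 Hz tone plus noise, sampled at 4 kHz.
fs = 4000
t = np.arange(fs) / fs  # one second of signal
rng = np.random.default_rng(0)
x = np.sin(2 * np.pi * 220 * t) + 0.2 * rng.standard_normal(fs)
df = dominant_frequency(x, fs)
```

On this synthetic input the estimated DF lands at the 220 Hz tone; in the study, the DF extracted this way was the feature that correlated best with the hemodynamic measurements.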
Source Title: Journal of Heart Valve Disease
ISSN: 0966-8519
Appears in Collections:Staff Publications
