Please use this identifier to cite or link to this item:
https://scholarbank.nus.edu.sg/handle/10635/146142
Title: Sparsity-based deconvolution of low-dose perfusion CT using learned dictionaries
Authors: Fang R.; Chen T.; Sanelli P.C.
Issue Date: 2012
Publisher: Springer Verlag
Citation: Fang R., Chen T., Sanelli P.C. (2012). Sparsity-based deconvolution of low-dose perfusion CT using learned dictionaries. Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) 7510 LNCS: 272-280. ScholarBank@NUS Repository.
Abstract: Computed tomography perfusion (CTP) is an important functional imaging modality in the evaluation of cerebrovascular diseases, such as stroke and vasospasm. However, the post-processed parametric maps of blood flow tend to be noisy, especially in low-dose CTP, due to the noisy contrast enhancement profile and the oscillatory nature of the results generated by current computational methods. In this paper, we propose a novel sparsity-based deconvolution method to estimate cerebral blood flow in CTP performed at low dose. We first build an overcomplete dictionary from high-dose perfusion maps and then perform deconvolution-based hemodynamic parameter estimation on the low-dose CTP data. Our method is validated on a clinical dataset of ischemic patients. The results show that our method achieves superior performance to existing methods and can potentially improve the differentiation between normal and ischemic tissue in the brain.
Source Title: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
URI: http://scholarbank.nus.edu.sg/handle/10635/146142
ISBN: 9783642334146
ISSN: 0302-9743
Appears in Collections: Staff Publications
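The abstract describes two ingredients: deconvolution of the tissue contrast-enhancement curve with the arterial input function (AIF) to recover cerebral blood flow (CBF), and a sparsity prior built from an overcomplete dictionary learned on high-dose perfusion maps. The sketch below is a minimal illustration of these ideas in Python, not the authors' implementation: it runs a standard truncated-SVD deconvolution (the oscillation-prone baseline the paper improves on) on a synthetic tissue curve, then learns a dictionary with scikit-learn and sparse-codes a "low-dose" map patch against it. All data, patch sizes, truncation thresholds, and dictionary settings here are hypothetical choices for demonstration only.

```python
# Illustrative sketch (not the authors' code): truncated-SVD deconvolution of a
# CTP tissue curve, followed by sparse coding of a CBF-map patch in a dictionary
# learned from synthetic stand-ins for high-dose training patches.
import numpy as np
from scipy.linalg import toeplitz
from sklearn.decomposition import MiniBatchDictionaryLearning

rng = np.random.default_rng(0)

# --- Step 1: deconvolve a tissue concentration curve with the AIF -----------
T, dt = 40, 1.0                                  # time points, sampling interval (s)
t = np.arange(T) * dt
aif = t * np.exp(-t / 3.0)                       # synthetic arterial input function
residue = np.exp(-t / 4.0)                       # synthetic residue function R(t)
cbf_true = 0.6                                   # true flow (arbitrary units)
A = dt * toeplitz(aif, np.zeros(T))              # lower-triangular convolution matrix
tissue = A @ (cbf_true * residue)                # C(t) = CBF * (AIF * R)(t)
tissue += 0.01 * rng.standard_normal(T)          # low-dose-like noise

# Truncated-SVD deconvolution (standard baseline, oscillation-prone at low dose)
U, s, Vt = np.linalg.svd(A)
keep = s > 0.1 * s[0]                            # truncation threshold (hypothetical)
k_est = Vt.T[:, keep] @ ((U[:, keep].T @ tissue) / s[keep])
cbf_est = k_est.max()                            # CBF taken as max of flow-scaled residue

# --- Step 2: sparse coding of map patches in a learned dictionary -----------
patch = 8                                        # hypothetical patch size
highdose_patches = rng.random((500, patch * patch))  # stand-in for high-dose map patches
dico = MiniBatchDictionaryLearning(n_components=128, alpha=1.0,
                                   transform_algorithm='omp',
                                   transform_n_nonzero_coefs=5, random_state=0)
dico.fit(highdose_patches)

lowdose_patch = rng.random((1, patch * patch))   # stand-in for a noisy low-dose patch
codes = dico.transform(lowdose_patch)            # sparse coefficients over learned atoms
denoised_patch = codes @ dico.components_        # sparsity-regularized reconstruction

print(f"baseline CBF estimate: {cbf_est:.3f} (true {cbf_true})")
print(f"nonzero dictionary atoms used: {int((codes != 0).sum())}")
```

In the paper's setting, the dictionary is learned from high-dose perfusion maps and the sparse representation regularizes the deconvolution itself; the sketch keeps the deconvolution and the dictionary steps separate only for brevity.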