Please use this identifier to cite or link to this item:
https://doi.org/10.1016/j.media.2013.02.005
Title: Towards robust deconvolution of low-dose perfusion CT: Sparse perfusion deconvolution using online dictionary learning
Authors: Fang, R.; Chen, T.; Sanelli, P.C.
Keywords: Computed tomography perfusion; Deconvolution algorithm; Online dictionary learning; Radiation dosage; Sparse representation
Issue Date: 2013
Citation: Fang, R., Chen, T., & Sanelli, P.C. (2013). Towards robust deconvolution of low-dose perfusion CT: Sparse perfusion deconvolution using online dictionary learning. Medical Image Analysis, 17(4), 417-428. ScholarBank@NUS Repository. https://doi.org/10.1016/j.media.2013.02.005
Abstract: Computed tomography perfusion (CTP) is an important functional imaging modality in the evaluation of cerebrovascular diseases, particularly in acute stroke and vasospasm. However, the post-processed parametric maps of blood flow tend to be noisy, especially in low-dose CTP, due to the noisy contrast enhancement profile and the oscillatory nature of the results generated by current computational methods. In this paper, we propose a robust sparse perfusion deconvolution method (SPD) to estimate cerebral blood flow (CBF) in CTP performed at low radiation dose. We first build a dictionary from high-dose perfusion maps using online dictionary learning and then perform deconvolution-based hemodynamic parameter estimation on the low-dose CTP data. Our method is validated on clinical data of patients with normal and pathological CBF maps. The results show that our method achieves superior performance compared to existing methods and can potentially improve the differentiation between normal and ischemic tissue in the brain.
Source Title: Medical Image Analysis
URI: http://scholarbank.nus.edu.sg/handle/10635/146106
ISSN: 1361-8415
DOI: 10.1016/j.media.2013.02.005
Appears in Collections: Staff Publications
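
The abstract outlines a two-stage pipeline: learn a patch dictionary from high-dose perfusion maps with online dictionary learning, then use it to regularize the hemodynamic parameters estimated by deconvolution of the low-dose CTP data. The sketch below is not the authors' implementation: it approximates the idea with scikit-learn's MiniBatchDictionaryLearning and a plain truncated-SVD deconvolution, and it splits the paper's joint estimation into two sequential steps (deconvolve, then sparsely denoise the CBF map). Patch size, atom count, and all function names are illustrative assumptions.

# Minimal sketch of the two-stage idea described in the abstract; all
# parameter values and helper names are assumptions, not the authors' code.
import numpy as np
from scipy.linalg import toeplitz
from sklearn.decomposition import MiniBatchDictionaryLearning, SparseCoder
from sklearn.feature_extraction.image import (extract_patches_2d,
                                              reconstruct_from_patches_2d)

PATCH = (8, 8)       # patch size for the dictionary (assumed value)
N_ATOMS = 256        # number of dictionary atoms (assumed value)

def learn_dictionary(high_dose_cbf_maps):
    """Learn a patch dictionary from a list of 2-D high-dose CBF maps."""
    patches = np.vstack([
        extract_patches_2d(m, PATCH).reshape(-1, PATCH[0] * PATCH[1])
        for m in high_dose_cbf_maps
    ])
    patches = patches - patches.mean(axis=1, keepdims=True)  # remove DC level
    dico = MiniBatchDictionaryLearning(n_components=N_ATOMS, alpha=1.0,
                                       batch_size=64, random_state=0)
    dico.fit(patches)
    return dico.components_                # shape (N_ATOMS, patch_dim)

def svd_deconvolve(ctp, aif, dt, lam=0.15):
    """Truncated-SVD deconvolution stand-in: ctp is (T, H, W), aif is (T,).
    Returns a map proportional to CBF (max of each residue function)."""
    A = dt * np.tril(toeplitz(aif))        # lower-triangular convolution matrix
    U, s, Vt = np.linalg.svd(A)
    s_inv = np.where(s > lam * s.max(), 1.0 / s, 0.0)   # drop small modes
    A_pinv = Vt.T @ np.diag(s_inv) @ U.T
    T, H, W = ctp.shape
    residue = A_pinv @ ctp.reshape(T, -1)  # k(t) = CBF * R(t) per voxel
    return residue.max(axis=0).reshape(H, W)

def sparse_denoise(cbf_low_dose, dictionary, n_nonzero=5):
    """Regularize a noisy low-dose CBF map with the learned dictionary."""
    patches = extract_patches_2d(cbf_low_dose, PATCH)
    flat = patches.reshape(len(patches), -1)
    means = flat.mean(axis=1, keepdims=True)
    coder = SparseCoder(dictionary=dictionary,
                        transform_algorithm='omp',
                        transform_n_nonzero_coefs=n_nonzero)
    codes = coder.transform(flat - means)  # sparse codes per patch
    recon = (codes @ dictionary + means).reshape(patches.shape)
    return reconstruct_from_patches_2d(recon, cbf_low_dose.shape)

A usage pattern under these assumptions would be: learn the dictionary once from high-dose CBF maps, deconvolve each low-dose CTP series with svd_deconvolve, and pass the resulting noisy CBF map through sparse_denoise. The paper instead couples the sparsity prior with the deconvolution fidelity term in a single optimization, which this sequential sketch does not reproduce.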