Please use this identifier to cite or link to this item:
|Title:||Joint spatial-spectral feature space clustering for speech activity detection from ECoG signals|
|Authors:||Kanas, V.G.; Mporas, I.; Benz, H.L.; Sgarbas, K.N.; Bezerianos, A.; Crone, N.E.|
|Keywords:||Brain-machine interfaces (BMIs); feature space clustering; speech activity detection|
|Issue Date:||2014|
|Citation:||Kanas, V.G., Mporas, I., Benz, H.L., Sgarbas, K.N., Bezerianos, A., Crone, N.E. (2014). Joint spatial-spectral feature space clustering for speech activity detection from ECoG signals. IEEE Transactions on Biomedical Engineering 61 (4): 1241-1250. ScholarBank@NUS Repository. https://doi.org/10.1109/TBME.2014.2298897|
|Abstract:||Brain-machine interfaces for speech restoration have been studied extensively for more than two decades. The success of such a system depends in part on selecting the brain recording sites and signal features that best correspond to speech production. The purpose of this study was to detect speech activity automatically from electrocorticographic (ECoG) signals based on joint spatial-frequency clustering of the ECoG feature space. For this study, the ECoG signals were recorded while a subject performed two different syllable repetition tasks. We found that the optimal frequency resolution for detecting speech activity from ECoG signals was 8 Hz, achieving 98.8% accuracy with a support vector machine classifier. We also identified the cortical areas that carried the most information for discriminating speech from nonspeech time intervals. Additionally, the results shed light on the distinct cortical areas associated with the two syllable repetition tasks and may contribute to the development of portable ECoG-based communication. © 1964-2012 IEEE.|
|Source Title:||IEEE Transactions on Biomedical Engineering|
|URI:||http://scholarbank.nus.edu.sg/handle/10635/128729|
|ISSN:||1558-2531|
|DOI:||10.1109/TBME.2014.2298897|
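The abstract describes classifying speech versus nonspeech intervals from ECoG spectral features computed at an 8 Hz frequency resolution. The sketch below is a minimal, numpy-only illustration of that idea, not the paper's pipeline: it bins FFT power into contiguous 8 Hz-wide bands per analysis window, then separates the two classes. All names (`band_power_features`, the simulated "high-gamma" speech signal) are hypothetical, and a nearest-centroid rule stands in for the paper's SVM classifier so the example needs no external libraries.

```python
import numpy as np

def band_power_features(x, fs, bin_hz=8.0, fmax=200.0):
    """Spectral power in contiguous bin_hz-wide bands up to fmax,
    for one analysis window of one channel (8 Hz bins, as in the paper)."""
    spec = np.abs(np.fft.rfft(x)) ** 2
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)
    edges = np.arange(0.0, fmax + bin_hz, bin_hz)
    return np.array([spec[(freqs >= lo) & (freqs < hi)].sum()
                     for lo, hi in zip(edges[:-1], edges[1:])])

# Toy data (NOT real ECoG): "speech" windows carry extra ~80 Hz
# (high-gamma-like) power on top of broadband noise.
rng = np.random.default_rng(0)
fs, n = 1000, 1000
t = np.arange(n) / fs

def window(speech):
    x = rng.standard_normal(n)
    if speech:
        x = x + 3.0 * np.sin(2 * np.pi * 80.0 * t)
    return band_power_features(x, fs)

X = np.array([window(i % 2 == 0) for i in range(40)])
y = np.array([i % 2 == 0 for i in range(40)])

# Nearest-centroid rule as a simple stand-in for the SVM classifier.
c_speech, c_rest = X[y].mean(axis=0), X[~y].mean(axis=0)
pred = (np.linalg.norm(X - c_speech, axis=1)
        < np.linalg.norm(X - c_rest, axis=1))
print("toy accuracy:", (pred == y).mean())
```

In this toy setup the 8 Hz band containing 80 Hz dominates the feature vector for "speech" windows, so the two classes separate cleanly; the study's contribution is selecting which spatial-spectral features carry that discriminative power in real recordings.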
|Appears in Collections:||Staff Publications|