Please use this identifier to cite or link to this item:
https://doi.org/10.3389/fnins.2014.00373
DC Field | Value
---|---
dc.title | Hybrid fNIRS-EEG based classification of auditory and visual perception processes
dc.contributor.author | Putze, F
dc.contributor.author | Hesslinger, S
dc.contributor.author | Tse, C.-Y
dc.contributor.author | Huang, Y
dc.contributor.author | Herff, C
dc.contributor.author | Guan, C
dc.contributor.author | Schultz, T
dc.date.accessioned | 2020-09-14T08:22:20Z
dc.date.available | 2020-09-14T08:22:20Z
dc.date.issued | 2014
dc.identifier.citation | Putze, F, Hesslinger, S, Tse, C.-Y, Huang, Y, Herff, C, Guan, C, Schultz, T (2014). Hybrid fNIRS-EEG based classification of auditory and visual perception processes. Frontiers in Neuroscience 8 (OCT) : Article 373. ScholarBank@NUS Repository. https://doi.org/10.3389/fnins.2014.00373
dc.identifier.issn | 1662-4548
dc.identifier.uri | https://scholarbank.nus.edu.sg/handle/10635/176173
dc.description.abstract | For multimodal Human-Computer Interaction (HCI), it is very useful to identify the modalities on which the user is currently processing information. This would enable a system to select complementary output modalities to reduce the user's workload. In this paper, we develop a hybrid Brain-Computer Interface (BCI) which uses Electroencephalography (EEG) and functional Near Infrared Spectroscopy (fNIRS) to discriminate and detect visual and auditory stimulus processing. We describe the experimental setup we used to collect our data corpus with 12 subjects. On this data, we performed a cross-validation evaluation, for which we report accuracy for different classification conditions. The results show that the subject-dependent systems achieved a classification accuracy of 97.8% for discriminating visual and auditory perception processes from each other and a classification accuracy of up to 94.8% for detecting modality-specific processes independently of other cognitive activity. The same classification conditions could also be discriminated in a subject-independent fashion, with accuracies of up to 94.6% and 86.7%, respectively. We also look at the contributions of the two signal types and show that the fusion of classifiers using different features significantly increases accuracy. © 2014 Putze, Hesslinger, Tse, Huang, Herff, Guan and Schultz.
dc.source | Unpaywall 20200831
dc.subject | adult
dc.subject | Article
dc.subject | auditory stimulation
dc.subject | brain computer interface
dc.subject | cerebral oximeter
dc.subject | classification
dc.subject | cognition
dc.subject | electroencephalography
dc.subject | female
dc.subject | functional neuroimaging
dc.subject | hearing
dc.subject | human
dc.subject | human experiment
dc.subject | male
dc.subject | near infrared spectroscopy
dc.subject | normal human
dc.subject | validation study
dc.subject | vision
dc.subject | visual stimulation
dc.type | Article
dc.contributor.department | ELECTRICAL AND COMPUTER ENGINEERING
dc.contributor.department | TEMASEK LABORATORIES
dc.description.doi | 10.3389/fnins.2014.00373
dc.description.sourcetitle | Frontiers in Neuroscience
dc.description.volume | 8
dc.description.issue | OCT
dc.description.page | Article 373
dc.published.state | Published
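The abstract above notes that fusing classifiers trained on different EEG and fNIRS features significantly increases accuracy. This record does not specify the paper's actual features or classifiers, so the following is only a minimal, hypothetical sketch of that late-fusion idea using scikit-learn: two linear discriminant classifiers, each trained on its own slice of the feature matrix, fused by averaging their class probabilities and evaluated with cross-validation. The feature split `N_EEG`, the synthetic placeholder data, and the choice of LDA are illustrative assumptions, not the authors' method.

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import FunctionTransformer, StandardScaler
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.ensemble import VotingClassifier
from sklearn.model_selection import cross_val_score

# Hypothetical feature layout: the first N_EEG columns stand in for EEG
# features (e.g. band power per channel), the remaining columns for fNIRS
# features (e.g. mean HbO/HbR concentration changes). Placeholder data only.
N_EEG = 64
rng = np.random.default_rng(0)
X = rng.standard_normal((120, N_EEG + 32))   # 120 trials
y = rng.integers(0, 2, size=120)             # 0 = auditory, 1 = visual

# One classifier per signal type, each seeing only "its" feature columns.
eeg_clf = make_pipeline(
    FunctionTransformer(lambda X: X[:, :N_EEG]),
    StandardScaler(),
    LinearDiscriminantAnalysis(),
)
fnirs_clf = make_pipeline(
    FunctionTransformer(lambda X: X[:, N_EEG:]),
    StandardScaler(),
    LinearDiscriminantAnalysis(),
)

# Late fusion: average the per-classifier class probabilities (soft voting).
fusion = VotingClassifier(
    estimators=[("eeg", eeg_clf), ("fnirs", fnirs_clf)],
    voting="soft",
)

# Subject-dependent style evaluation via stratified 5-fold cross-validation.
scores = cross_val_score(fusion, X, y, cv=5)
print(f"mean CV accuracy: {scores.mean():.3f}")
```

In a real pipeline, the random placeholder matrix would be replaced by per-trial features extracted from the recorded EEG and fNIRS signals; the fusion and evaluation scaffolding stays the same.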
Appears in Collections: Elements Staff Publications
Files in This Item:
File | Description | Size | Format | Access Settings | Version
---|---|---|---|---|---
10_3389_fnins_2014_00373.pdf | | 3.2 MB | Adobe PDF | OPEN | None