Please use this identifier to cite or link to this item: https://doi.org/10.3389/fnins.2014.00373
Title: | Hybrid fNIRS-EEG based classification of auditory and visual perception processes |
Authors: | Putze, F.; Hesslinger, S.; Tse, C.-Y.; Huang, Y.; Herff, C.; Guan, C.; Schultz, T. |
Keywords: | adult; article; auditory stimulation; brain computer interface; cerebral oximeter; classification; cognition; electroencephalography; female; functional neuroimaging; hearing; human; human experiment; male; near infrared spectroscopy; normal human; validation study; vision; visual stimulation |
Issue Date: | 2014 |
Citation: | Putze, F., Hesslinger, S., Tse, C.-Y., Huang, Y., Herff, C., Guan, C., Schultz, T. (2014). Hybrid fNIRS-EEG based classification of auditory and visual perception processes. Frontiers in Neuroscience, 8 (OCT): Article 373. ScholarBank@NUS Repository. https://doi.org/10.3389/fnins.2014.00373 |
Abstract: | For multimodal Human-Computer Interaction (HCI), it is very useful to identify the modalities in which the user is currently processing information. This would enable a system to select complementary output modalities and thereby reduce the user's workload. In this paper, we develop a hybrid Brain-Computer Interface (BCI) that uses Electroencephalography (EEG) and functional Near-Infrared Spectroscopy (fNIRS) to discriminate and detect visual and auditory stimulus processing. We describe the experimental setup used to collect our data corpus with 12 subjects. On these data, we performed a cross-validation evaluation and report accuracy for different classification conditions. The results show that the subject-dependent systems achieved a classification accuracy of 97.8% for discriminating visual from auditory perception processes and up to 94.8% for detecting modality-specific processes independently of other cognitive activity. The same classification conditions could also be discriminated in a subject-independent fashion, with accuracies of up to 94.6% and 86.7%, respectively. We also examine the contributions of the two signal types and show that fusing classifiers that use different features significantly increases accuracy. © 2014 Putze, Hesslinger, Tse, Huang, Herff, Guan and Schultz. |
Source Title: | Frontiers in Neuroscience |
URI: | https://scholarbank.nus.edu.sg/handle/10635/176173 |
ISSN: | 1662-4548 |
DOI: | 10.3389/fnins.2014.00373 |
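The abstract describes cross-validated classification of EEG and fNIRS features and a fusion of classifiers that improves accuracy. As a rough illustration of what such a late-fusion evaluation might look like (this is a minimal sketch, not the authors' actual pipeline: the synthetic trial features, LDA classifiers, and posterior-averaging rule below are assumptions made for demonstration only):

```python
# Minimal sketch: late fusion of an EEG classifier and an fNIRS classifier
# under cross-validation. Feature extraction, channel selection, and the exact
# models/parameters of Putze et al. (2014) are NOT reproduced here; the
# synthetic data and model choices are illustrative assumptions.
import numpy as np
from sklearn.model_selection import StratifiedKFold
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
n_trials = 200

# Labels: 0 = auditory stimulus trial, 1 = visual stimulus trial.
y = rng.integers(0, 2, size=n_trials)

# Stand-ins for per-trial feature vectors (e.g., spectral features for EEG,
# hemodynamic-response features for fNIRS). Shapes and offsets are arbitrary.
X_eeg = rng.normal(size=(n_trials, 32)) + y[:, None] * 0.8
X_fnirs = rng.normal(size=(n_trials, 16)) + y[:, None] * 0.5

cv = StratifiedKFold(n_splits=5, shuffle=True, random_state=0)
fused_acc = []
for train_idx, test_idx in cv.split(X_eeg, y):
    # One classifier per modality, each trained on its own feature set.
    clf_eeg = LinearDiscriminantAnalysis().fit(X_eeg[train_idx], y[train_idx])
    clf_fnirs = LinearDiscriminantAnalysis().fit(X_fnirs[train_idx], y[train_idx])

    # Late fusion: average the class-posterior estimates of the two models,
    # then pick the class with the higher fused probability.
    p = (clf_eeg.predict_proba(X_eeg[test_idx])
         + clf_fnirs.predict_proba(X_fnirs[test_idx])) / 2.0
    fused_acc.append(accuracy_score(y[test_idx], p.argmax(axis=1)))

print(f"Fused cross-validated accuracy: {np.mean(fused_acc):.3f}")
```

For the subject-independent results reported in the abstract, the same idea would apply with a leave-one-subject-out split instead of the within-subject folds sketched above.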
Appears in Collections: | Elements Staff Publications |
Files in This Item:
File | Description | Size | Format | Access Settings | Version
---|---|---|---|---|---
10_3389_fnins_2014_00373.pdf | | 3.2 MB | Adobe PDF | OPEN | None