Author
Listed:
- Britta U Westner
- Sarang S Dalal
- Simon Hanslmayr
- Tobias Staudigl
Abstract
Single-trial analyses have the potential to uncover meaningful brain dynamics that are obscured when averaging across trials. However, low signal-to-noise ratio (SNR) can impede the use of single-trial analyses and decoding methods. In this study, we investigate the applicability of a single-trial approach to decode stimulus modality from magnetoencephalographic (MEG) high frequency activity. In order to classify the auditory versus visual presentation of words, we combine beamformer source reconstruction with the random forest classification method. To enable group level inference, the classification is embedded in an across-subjects framework. We show that single-trial gamma SNR allows for good classification performance (accuracy across subjects: 66.44%). This implies that the characteristics of high frequency activity have a high consistency across trials and subjects. The random forest classifier assigned informational value to activity in both auditory and visual cortex with high spatial specificity. Across time, gamma power was most informative during stimulus presentation. Among all frequency bands, the 75 Hz to 95 Hz band was the most informative frequency band in visual as well as in auditory areas. Especially in visual areas, a broad range of gamma frequencies (55 Hz to 125 Hz) contributed to the successful classification. Thus, we demonstrate the feasibility of single-trial approaches for decoding the stimulus modality across subjects from high frequency activity and describe the discriminative gamma activity in time, frequency, and space.
Author summary: Averaging brain activity across trials is a powerful way to increase signal-to-noise ratio in MEG data. This approach, however, potentially obscures meaningful brain dynamics that unfold on the single-trial level.
Single-trial analyses have been successfully applied to time domain or low frequency oscillatory activity; their application to MEG high frequency activity is hindered by the low amplitude of these signals. In the present study, we show that stimulus modality (visual versus auditory presentation of words) can successfully be decoded from single-trial MEG high frequency activity by combining source reconstruction with a random forest classification algorithm. This approach reveals patterns of activity above 75 Hz in both visual and auditory cortex, highlighting the importance of high frequency activity for the processing of domain-specific stimuli. Our results thereby extend prior findings by revealing high-frequency activity in auditory cortex related to auditory word stimuli in MEG data. The adopted across-subjects framework furthermore suggests a high inter-individual consistency in the high frequency activity patterns.
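The across-subjects decoding scheme described in the abstract can be sketched in a few lines. The sketch below is an illustrative assumption, not the authors' pipeline: it uses scikit-learn's RandomForestClassifier on synthetic data standing in for single-trial source-space gamma power features, with leave-one-subject-out cross-validation to mimic the across-subjects framework. All names, shapes, and effect sizes are invented for the example.

```python
# Illustrative sketch of across-subjects random forest decoding.
# Synthetic features stand in for single-trial gamma power per source voxel;
# this is NOT the authors' actual analysis code.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import LeaveOneGroupOut, cross_val_score

rng = np.random.default_rng(0)

n_subjects, trials_per_subject, n_features = 10, 40, 200
X, y, groups = [], [], []
for subj in range(n_subjects):
    labels = rng.integers(0, 2, trials_per_subject)      # 0 = auditory, 1 = visual
    feats = rng.normal(size=(trials_per_subject, n_features))
    feats[:, :10] += labels[:, None] * 0.8               # weak class-dependent signal
    X.append(feats)
    y.append(labels)
    groups.append(np.full(trials_per_subject, subj))
X, y, groups = np.vstack(X), np.concatenate(y), np.concatenate(groups)

# Leave-one-subject-out cross-validation: train on all-but-one subject,
# test on the held-out subject, enabling group-level inference.
clf = RandomForestClassifier(n_estimators=200, random_state=0)
scores = cross_val_score(clf, X, y, cv=LeaveOneGroupOut(), groups=groups)
print(f"mean accuracy across held-out subjects: {scores.mean():.2f}")
```

Training on pooled trials from all other subjects and testing on a held-out subject means above-chance accuracy reflects feature patterns that generalize across individuals, which is the logic behind the paper's consistency claim.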
Suggested Citation
Britta U Westner & Sarang S Dalal & Simon Hanslmayr & Tobias Staudigl, 2018.
"Across-subjects classification of stimulus modality from human MEG high frequency activity,"
PLOS Computational Biology, Public Library of Science, vol. 14(3), pages 1-14, March.
Handle:
RePEc:plo:pcbi00:1005938
DOI: 10.1371/journal.pcbi.1005938
Corrections
All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:plo:pcbi00:1005938. See general information about how to correct material in RePEc.