Project Details
Description
This study uses a Brain-Computer Interface (BCI) to recognize Motor Imagery (MI). We developed a game in a Virtual Reality (VR) environment to collect Electroencephalography (EEG) signals while subjects imagine moving their left hand, imagine moving their right hand, or remain at rest, and we built a neural network that detects these imagined movements in real time. The network combines an attention mechanism with a Convolutional Neural Network (CNN) operating in the time-frequency domain to extract features from the EEG signals associated with imagined movements, using the 1-second EEG segment preceding each imagined movement for recognition. Because its self-attention mechanism adapts well to sequential data, the Transformer both suppresses noise and extracts key features from the EEG; the Transformer-processed signals are then fed into EEGNet for three-class classification. In tests with eight participants, our model achieved an average accuracy of 71.32%, a 9.17% improvement over a baseline that uses EEGNet alone. This result demonstrates the potential of combining the Transformer and EEGNet in BCI applications and suggests a new direction for future research.
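The description above outlines a two-stage pipeline: a Transformer encoder re-weights a 1-second EEG window via self-attention, and an EEGNet-style compact CNN classifies the result into three classes (left hand, right hand, rest). The sketch below illustrates one plausible way to wire such a pipeline in PyTorch; the channel count, sampling rate, layer sizes, and the `TransformerEEGNet` class itself are illustrative assumptions, not the project's actual implementation or hyperparameters.

```python
# Minimal sketch of a Transformer-then-EEGNet pipeline (assumed architecture,
# not the project's released code). Assumes 32 EEG channels sampled at 128 Hz,
# so one 1-second window is a (channels, samples) = (32, 128) array.
import torch
import torch.nn as nn


class TransformerEEGNet(nn.Module):
    def __init__(self, n_channels=32, n_samples=128, n_classes=3):
        super().__init__()
        # Self-attention over time steps: each time step is the vector of
        # per-channel amplitudes at that sample.
        encoder_layer = nn.TransformerEncoderLayer(
            d_model=n_channels, nhead=4, dim_feedforward=128, batch_first=True)
        self.transformer = nn.TransformerEncoder(encoder_layer, num_layers=2)

        # EEGNet-style blocks: temporal convolution, depthwise spatial
        # convolution across electrodes, then a separable convolution.
        self.temporal = nn.Sequential(
            nn.Conv2d(1, 8, (1, 64), padding=(0, 32), bias=False),
            nn.BatchNorm2d(8))
        self.spatial = nn.Sequential(
            nn.Conv2d(8, 16, (n_channels, 1), groups=8, bias=False),
            nn.BatchNorm2d(16), nn.ELU(),
            nn.AvgPool2d((1, 4)), nn.Dropout(0.5))
        self.separable = nn.Sequential(
            nn.Conv2d(16, 16, (1, 16), padding=(0, 8), groups=16, bias=False),
            nn.Conv2d(16, 16, (1, 1), bias=False),
            nn.BatchNorm2d(16), nn.ELU(),
            nn.AvgPool2d((1, 8)), nn.Dropout(0.5))
        self.classify = nn.LazyLinear(n_classes)

    def forward(self, x):
        # x: (batch, channels, samples) -- one 1-second EEG window per item.
        x = self.transformer(x.permute(0, 2, 1))  # (batch, samples, channels)
        x = x.permute(0, 2, 1).unsqueeze(1)       # (batch, 1, channels, samples)
        x = self.separable(self.spatial(self.temporal(x)))
        return self.classify(x.flatten(1))        # logits for the 3 classes


if __name__ == "__main__":
    model = TransformerEEGNet()
    window = torch.randn(4, 32, 128)   # 4 windows, 32 channels, 1 s at 128 Hz
    print(model(window).shape)         # torch.Size([4, 3])
```

For real-time use, the same forward pass would simply be applied to each incoming 1-second window as it is captured from the headset.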
| Status | Finished |
| --- | --- |
| Effective start/end date | 1/08/23 → 31/07/24 |
Keywords
- Attention Mechanism
- EEGNet
- Motor Imagery
- Brainwaves
- Real-Time Recognition