GRU¶
- class torcheeg.models.GRU(num_electrodes: int = 32, hid_channels: int = 64, num_classes: int = 2)[source]¶
A simple but effective gated recurrent unit (GRU) network structure from the book by Zhang et al. For more details, please refer to the following information.
Book: Zhang X, Yao L. Deep Learning for EEG-Based Brain-Computer Interfaces: Representations, Algorithms and Applications[M]. 2021.
URL: https://www.worldscientific.com/worldscibooks/10.1142/q0282#t=aboutBook
Related Project: https://github.com/xiangzhang1015/Deep-Learning-for-BCI/blob/master/pythonscripts/4-1-2_GRU.py
Below is a recommended usage pipeline for emotion recognition tasks:
from torcheeg.datasets import DEAPDataset
from torcheeg import transforms
from torcheeg.models import GRU
from torch.utils.data import DataLoader

dataset = DEAPDataset(root_path='./data_preprocessed_python',
                      online_transform=transforms.ToTensor(),
                      label_transform=transforms.Compose([
                          transforms.Select('valence'),
                          transforms.Binary(5.0),
                      ]))

model = GRU(num_electrodes=32, hid_channels=64, num_classes=2)

x, y = next(iter(DataLoader(dataset, batch_size=64)))
model(x)
- Parameters:
  - num_electrodes (int) – The number of electrodes, i.e., \(C\) in the paper. (default: 32)
  - hid_channels (int) – The number of hidden nodes in the GRU layers and the fully connected layer. (default: 64)
  - num_classes (int) – The number of classes to predict. (default: 2)
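To make the role of these parameters concrete, the recurrence inside each GRU layer can be sketched in plain numpy. This is a minimal, illustrative single-time-step GRU cell, not torcheeg's implementation: the weight matrices, the omission of biases, and the random initialization are all assumptions for demonstration. It shows how `num_electrodes` sets the input width and `hid_channels` sets the hidden-state width.

```python
import numpy as np

def sigmoid(a):
    return 1.0 / (1.0 + np.exp(-a))

def gru_cell(x_t, h_prev, Wz, Uz, Wr, Ur, Wh, Uh):
    """One GRU time step (biases omitted for brevity).

    x_t:    input at time t, shape [num_electrodes]
    h_prev: previous hidden state, shape [hid_channels]
    """
    z = sigmoid(Wz @ x_t + Uz @ h_prev)               # update gate
    r = sigmoid(Wr @ x_t + Ur @ h_prev)               # reset gate
    h_tilde = np.tanh(Wh @ x_t + Uh @ (r * h_prev))   # candidate state
    return (1.0 - z) * h_prev + z * h_tilde           # new hidden state

num_electrodes, hid_channels = 32, 64
rng = np.random.default_rng(0)
W = lambda m, n: rng.standard_normal((m, n)) * 0.1    # toy random weights

h = np.zeros(hid_channels)
x_t = rng.standard_normal(num_electrodes)
h = gru_cell(x_t, h,
             W(hid_channels, num_electrodes), W(hid_channels, hid_channels),
             W(hid_channels, num_electrodes), W(hid_channels, hid_channels),
             W(hid_channels, num_electrodes), W(hid_channels, hid_channels))
print(h.shape)  # (64,)
```

One such step runs per data point in the EEG chunk; the final hidden state (width `hid_channels`) is then mapped by the fully connected layer to `num_classes` outputs.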
- forward(x: Tensor) → Tensor[source]¶
- Parameters:
  - x (torch.Tensor) – EEG signal representation; the ideal input shape is [n, 32, 128]. Here, n corresponds to the batch size, 32 corresponds to num_electrodes, and 128 corresponds to the number of data points included in the input EEG chunk.
- Returns:
  The predicted probability that the samples belong to the classes.
- Return type:
  torch.Tensor[number of sample, number of classes]
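The shape contract of forward can be sketched without the library itself. The stand-in below is an assumption-laden mock, not torcheeg's GRU: it uses a simplified tanh recurrence in place of full GRU gating, with arbitrary random weights, purely to show how an input of shape [n, num_electrodes, num_points] is consumed point by point and reduced to [n, num_classes] logits.

```python
import numpy as np

def forward_shape(x, hid_channels=64, num_classes=2):
    """Stand-in mirroring GRU.forward's shape contract:
    [n, num_electrodes, num_points] -> [n, num_classes].
    """
    n, num_electrodes, num_points = x.shape
    rng = np.random.default_rng(42)
    Wxh = rng.standard_normal((hid_channels, num_electrodes)) * 0.05
    Whh = rng.standard_normal((hid_channels, hid_channels)) * 0.05
    Why = rng.standard_normal((num_classes, hid_channels)) * 0.05

    h = np.zeros((n, hid_channels))
    for t in range(num_points):
        # Each of the 128 data points is one recurrence step
        # (simplified tanh cell, not the full GRU gating above).
        h = np.tanh(x[:, :, t] @ Wxh.T + h @ Whh.T)
    return h @ Why.T  # one score per class

x = np.random.randn(64, 32, 128)  # batch of 64 chunks, 32 electrodes, 128 points
out = forward_shape(x)
print(out.shape)  # (64, 2)
```

The real model's output has the same [number of samples, number of classes] shape, so it can be passed directly to a loss such as torch.nn.CrossEntropyLoss.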