Recurrent Neural Networks
torcheeg.models.GRU
- class torcheeg.models.GRU(num_electrodes: int = 32, hid_channels: int = 64, num_classes: int = 2)[source]
Bases:
Module
A simple but effective gated recurrent unit (GRU) network structure from the book by Zhang et al. For more details, please refer to the following information.
Book: Zhang X, Yao L. Deep Learning for EEG-Based Brain-Computer Interfaces: Representations, Algorithms and Applications[M]. 2021.
URL: https://www.worldscientific.com/worldscibooks/10.1142/q0282#t=aboutBook
Related Project: https://github.com/xiangzhang1015/Deep-Learning-for-BCI/blob/master/pythonscripts/4-1-2_GRU.py
Below is a recommended suite for use in emotion recognition tasks:
dataset = DEAPDataset(io_path=f'./deap',
                      root_path='./data_preprocessed_python',
                      online_transform=transforms.ToTensor(),
                      label_transform=transforms.Compose([
                          transforms.Select('valence'),
                          transforms.Binary(5.0),
                      ]))
model = GRU(num_electrodes=32, hid_channels=64, num_classes=2)
- Parameters
num_electrodes (int) – The number of electrodes, i.e., \(C\) in the paper. (default: 32)
hid_channels (int) – The number of hidden nodes in the GRU layers and the fully connected layer. (default: 64)
num_classes (int) – The number of classes to predict. (default: 2)
- forward(x: Tensor) Tensor [source]
- Parameters
x (torch.Tensor) – EEG signal representation; the ideal input shape is [n, 32, 128]. Here, n corresponds to the batch size, 32 corresponds to num_electrodes, and 128 corresponds to the number of data points included in the input EEG chunk.
- Returns
The predicted probability that the samples belong to the classes.
- Return type
torch.Tensor[number of samples, number of classes]
- training: bool
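The input/output shapes documented above can be illustrated with a minimal stand-in module. This is a sketch built from torch.nn.GRU and a linear head, not the torcheeg source; the exact layer layout inside torcheeg.models.GRU is assumed here, but the tensor shapes match the documented contract.

```python
import torch
import torch.nn as nn

class GRUSketch(nn.Module):
    """Hypothetical minimal re-implementation mirroring the documented interface."""
    def __init__(self, num_electrodes: int = 32, hid_channels: int = 64, num_classes: int = 2):
        super().__init__()
        # Treat each time step's 32 electrode readings as one input vector.
        self.gru = nn.GRU(input_size=num_electrodes, hidden_size=hid_channels, batch_first=True)
        self.fc = nn.Linear(hid_channels, num_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: [n, num_electrodes, num_points] -> [n, num_points, num_electrodes]
        x = x.permute(0, 2, 1)
        _, h = self.gru(x)          # h: [1, n, hid_channels] (last hidden state)
        return self.fc(h[-1])       # class scores: [n, num_classes]

x = torch.randn(4, 32, 128)         # a batch of 4 EEG chunks
out = GRUSketch()(x)
print(out.shape)                    # torch.Size([4, 2])
```

Note that the head produces raw class scores (logits); a softmax would be needed to read them as probabilities.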
torcheeg.models.LSTM
- class torcheeg.models.LSTM(num_electrodes: int = 32, hid_channels: int = 64, num_classes: int = 2)[source]
Bases:
Module
A simple but effective long short-term memory (LSTM) network structure from the book by Zhang et al. For more details, please refer to the following information.
Book: Zhang X, Yao L. Deep Learning for EEG-Based Brain-Computer Interfaces: Representations, Algorithms and Applications[M]. 2021.
URL: https://www.worldscientific.com/worldscibooks/10.1142/q0282#t=aboutBook
Related Project: https://github.com/xiangzhang1015/Deep-Learning-for-BCI/blob/master/pythonscripts/4-1-1_LSTM.py
Below is a recommended suite for use in emotion recognition tasks:
dataset = DEAPDataset(io_path=f'./deap',
                      root_path='./data_preprocessed_python',
                      online_transform=transforms.ToTensor(),
                      label_transform=transforms.Compose([
                          transforms.Select('valence'),
                          transforms.Binary(5.0),
                      ]))
model = LSTM(num_electrodes=32, hid_channels=64, num_classes=2)
- Parameters
num_electrodes (int) – The number of electrodes, i.e., \(C\) in the paper. (default: 32)
hid_channels (int) – The number of hidden nodes in the LSTM layers and the fully connected layer. (default: 64)
num_classes (int) – The number of classes to predict. (default: 2)
- forward(x: Tensor) Tensor [source]
- Parameters
x (torch.Tensor) – EEG signal representation; the ideal input shape is [n, 32, 128]. Here, n corresponds to the batch size, 32 corresponds to num_electrodes, and 128 corresponds to the number of data points included in the input EEG chunk.
- Returns
The predicted probability that the samples belong to the classes.
- Return type
torch.Tensor[number of samples, number of classes]
- training: bool
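As with the GRU above, the LSTM's shape contract can be sketched with a minimal stand-in. This is a hypothetical module built from torch.nn.LSTM, not the torcheeg source; the internal layout is assumed, but the documented input and output shapes are reproduced.

```python
import torch
import torch.nn as nn

class LSTMSketch(nn.Module):
    """Hypothetical minimal re-implementation mirroring the documented interface."""
    def __init__(self, num_electrodes: int = 32, hid_channels: int = 64, num_classes: int = 2):
        super().__init__()
        self.lstm = nn.LSTM(input_size=num_electrodes, hidden_size=hid_channels, batch_first=True)
        self.fc = nn.Linear(hid_channels, num_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: [n, num_electrodes, num_points] -> [n, num_points, num_electrodes]
        x = x.permute(0, 2, 1)
        _, (h, _) = self.lstm(x)    # h: [1, n, hid_channels] (last hidden state)
        return self.fc(h[-1])       # class scores: [n, num_classes]

x = torch.randn(4, 32, 128)
out = LSTMSketch()(x)
print(out.shape)                    # torch.Size([4, 2])
```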