torcheeg.trainers

Basic Classification

ClassifierTrainer

A generic trainer class for EEG classification.

Cross-domain Classification

The individual differences and nonstationarity of EEG signals make it difficult for deep learning models trained on one set of subjects to correctly classify test samples from unseen subjects, since the training set and test set come from different data distributions. Domain adaptation addresses this distribution shift between training and test sets and thus achieves good performance in subject-independent (cross-subject) scenarios.

CORALTrainer

This class supports the implementation of CORrelation ALignment (CORAL) for deep domain adaptation.
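CORAL aligns the second-order statistics of source and target features. As an illustrative pure-Python sketch (not torcheeg's actual implementation, which operates on PyTorch tensors), the loss is the squared Frobenius distance between the two feature covariance matrices, scaled by 1/(4d²); the function names here are chosen for illustration:

```python
def covariance(X):
    # X: list of feature rows (n samples x d dims); unbiased estimate
    n, d = len(X), len(X[0])
    means = [sum(col) / n for col in zip(*X)]
    C = [[0.0] * d for _ in range(d)]
    for row in X:
        centered = [x - m for x, m in zip(row, means)]
        for i in range(d):
            for j in range(d):
                C[i][j] += centered[i] * centered[j]
    return [[c / (n - 1) for c in row] for row in C]

def coral_loss(Xs, Xt):
    # squared Frobenius distance between source and target feature
    # covariances, scaled by 1 / (4 d^2) as in the CORAL formulation
    d = len(Xs[0])
    Cs, Ct = covariance(Xs), covariance(Xt)
    frob2 = sum((Cs[i][j] - Ct[i][j]) ** 2
                for i in range(d) for j in range(d))
    return frob2 / (4 * d * d)
```

During training this term is added to the classification loss, encouraging the extractor to produce features whose statistics match across domains.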

DDCTrainer

This class supports the implementation of Deep Domain Confusion (DDC) for deep domain adaptation.
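DDC-style methods align source and target feature distributions by minimizing the maximum mean discrepancy (MMD) between them. The sketch below is illustrative only: it computes a biased estimate of squared MMD with an RBF kernel in plain Python (DDC originally uses a simpler linear-kernel MMD on one adaptation layer), and the names are assumptions, not torcheeg's API:

```python
import math

def rbf(x, y, gamma=1.0):
    # RBF kernel between two feature vectors
    return math.exp(-gamma * sum((a - b) ** 2 for a, b in zip(x, y)))

def mmd2(Xs, Xt, gamma=1.0):
    # biased estimate of squared MMD between source and target batches
    m, n = len(Xs), len(Xt)
    k_ss = sum(rbf(a, b, gamma) for a in Xs for b in Xs) / (m * m)
    k_tt = sum(rbf(a, b, gamma) for a in Xt for b in Xt) / (n * n)
    k_st = sum(rbf(a, b, gamma) for a in Xs for b in Xt) / (m * n)
    return k_ss + k_tt - 2 * k_st
```

DAN extends the same idea by matching multiple layers with a multi-kernel MMD.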

DANTrainer

This class supports the implementation of Deep Adaptation Network (DAN) for deep domain adaptation.

JANTrainer

This class supports the implementation of Joint Adaptation Networks (JAN) for deep domain adaptation.

ADATrainer

This class supports the implementation of Associative Domain Adaptation (ADA) for deep domain adaptation.

DANNTrainer

This class supports the implementation of Domain-Adversarial Neural Networks (DANN) for deep domain adaptation.

CenterLossTrainer

A trainer that trains a classification model consisting of a feature extractor and a classifier.
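Center loss pulls each extracted feature vector toward a learned center for its class, so intra-class features become compact while the classifier keeps classes separable. A minimal pure-Python sketch of the loss term (the real trainer works on tensors and updates the centers during training; names here are illustrative):

```python
def center_loss(features, labels, centers):
    # mean squared distance between each feature vector and the
    # center of its class, halved as in the usual formulation
    total = 0.0
    for f, y in zip(features, labels):
        total += sum((a - c) ** 2 for a, c in zip(f, centers[y]))
    return total / (2 * len(features))
```

In practice this term is weighted and added to the cross-entropy loss of the classifier.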

Imbalance Learning for Classification

EEG emotion datasets suffer from sample class imbalance; imbalance learning can be used to address this problem in emotion recognition tasks.

LALossTrainer

A trainer class for EEG classification with Logit-adjusted (LA) loss for imbalanced datasets.
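Logit adjustment shifts each logit by the log of its class prior before the softmax, so that rare classes must be predicted with a larger margin. A hedged pure-Python sketch of the per-sample loss (function and parameter names are illustrative, not torcheeg's API):

```python
import math

def la_cross_entropy(logits, label, class_priors, tau=1.0):
    # logit-adjusted cross entropy: add tau * log(prior_j) to each
    # logit z_j, then compute the usual softmax cross entropy
    adjusted = [z + tau * math.log(p) for z, p in zip(logits, class_priors)]
    m = max(adjusted)
    log_sum = m + math.log(sum(math.exp(z - m) for z in adjusted))
    return log_sum - adjusted[label]
```

With uniform priors the adjustment cancels and the loss reduces to plain cross entropy; for a rare true class the loss grows, enlarging its decision margin.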

LDAMLossTrainer

A trainer class for EEG classification with Label-distribution-aware margin (LDAM) loss for imbalanced datasets.
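LDAM enforces a per-class margin that shrinks with the fourth root of the class sample count, so rarer classes get larger margins. A minimal illustrative sketch in plain Python (names assumed, not torcheeg's implementation):

```python
import math

def ldam_loss(logits, label, class_counts, C=0.5):
    # subtract a margin C / n_j^(1/4) from the true-class logit,
    # then compute softmax cross entropy on the adjusted logits
    margin = C / (class_counts[label] ** 0.25)
    adjusted = list(logits)
    adjusted[label] -= margin
    m = max(adjusted)
    log_sum = m + math.log(sum(math.exp(z - m) for z in adjusted))
    return log_sum - adjusted[label]
```

Setting `C = 0` recovers plain cross entropy; smaller class counts yield larger margins and therefore larger losses for the same logits.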

EQLossTrainer

A trainer class for EEG classification with Equalization (EQ) loss for imbalanced datasets.

FocalLossTrainer

A trainer class for EEG classification with Focal loss for imbalanced datasets.
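Focal loss down-weights well-classified examples by the factor (1 − p)^γ, so training focuses on hard or minority-class samples. A minimal pure-Python sketch (illustrative names, assuming `probs` are already softmax probabilities):

```python
import math

def focal_loss(probs, label, gamma=2.0):
    # (1 - p)^gamma scales down the cross entropy of confident
    # predictions; gamma = 0 recovers plain cross entropy
    p = probs[label]
    return -((1 - p) ** gamma) * math.log(p)
```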

WCELossTrainer

A trainer class for EEG classification with Weighted Cross Entropy (WCE) loss for imbalanced datasets.

EEG Generation

Data scarcity and data imbalance are among the major challenges in the analysis of EEG signals. TorchEEG provides different types of generative model trainers to help train generative models that augment EEG datasets. Trainers whose names start with "C" are for category-conditioned generative models, allowing the user to control the category of the generated EEG signal; the others generate samples close to real EEG signals without controlling the category.

BetaVAETrainer

This class provides the implementation for BetaVAE training.

CBetaVAETrainer

This class provides the implementation for conditional BetaVAE (CBetaVAE) training.

WGANGPTrainer

This class provides the implementation for Wasserstein GAN with gradient penalty (WGAN-GP) training.
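WGAN-GP regularizes the critic by penalizing the deviation of its gradient norm from 1 at points interpolated between real and generated samples. The real trainer computes this gradient with autograd on tensors; the sketch below is only an illustration that approximates the gradient numerically on a single point (all names are assumptions):

```python
import math

def interpolate(x_real, x_fake, eps):
    # point on the line between a real and a generated sample
    return [eps * a + (1 - eps) * b for a, b in zip(x_real, x_fake)]

def gradient_penalty(critic, x_hat, h=1e-5):
    # central-difference gradient of the critic at x_hat,
    # then (||grad||_2 - 1)^2; WGAN-GP adds lambda * this term
    # to the critic loss
    grad = []
    for i in range(len(x_hat)):
        xp, xm = list(x_hat), list(x_hat)
        xp[i] += h
        xm[i] -= h
        grad.append((critic(xp) - critic(xm)) / (2 * h))
    norm = math.sqrt(sum(g * g for g in grad))
    return (norm - 1.0) ** 2
```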

CWGANGPTrainer

This class provides the implementation for conditional WGAN-GP (CWGAN-GP) training.

Self-supervised Algorithm for Pre-training

As the cost of data collection decreases, unlabeled data becomes much easier to obtain. How to use unlabeled data to train models that learn task-independent general knowledge from large-scale datasets has therefore attracted extensive attention. In natural language processing and computer vision, self-supervised models have made continuous progress by building pretext tasks to learn good language or visual representations. Today, self-supervised learning algorithms are also being applied to EEG analysis to train larger models.

SimCLRTrainer

This class supports the implementation of A Simple Framework for Contrastive Learning of Visual Representations (SimCLR) for self-supervised pre-training.
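SimCLR trains the encoder with the NT-Xent (normalized temperature-scaled cross entropy) loss: two augmented views of the same signal form a positive pair, and all other embeddings in the batch act as negatives. A small pure-Python sketch under the assumption that consecutive embeddings `z[2k]` and `z[2k+1]` are positive pairs (names are illustrative, not torcheeg's API):

```python
import math

def cosine(a, b):
    # cosine similarity between two (nonzero) embeddings
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def nt_xent(z, tau=0.5):
    # z: 2N embeddings where z[2k] and z[2k+1] are a positive pair;
    # each sample's positive is contrasted against all other samples
    n = len(z)
    loss = 0.0
    for i in range(n):
        j = i + 1 if i % 2 == 0 else i - 1  # index of the positive
        num = math.exp(cosine(z[i], z[j]) / tau)
        den = sum(math.exp(cosine(z[i], z[k]) / tau)
                  for k in range(n) if k != i)
        loss += -math.log(num / den)
    return loss / n
```

Minimizing this loss pulls positive pairs together and pushes other samples apart in the embedding space.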

BYOLTrainer

This class supports the implementation of Bootstrap Your Own Latent (BYOL) for self-supervised pre-training.
