
CORALTrainer

class torcheeg.trainers.CORALTrainer(extractor: Module, classifier: Module, num_classes: int, lr: float = 0.0001, weight_decay: float = 0.0, weight_domain: float = 1.0, weight_scheduler: bool = True, lr_scheduler_gamma: float = 0.0, lr_scheduler_decay: float = 0.75, warmup_epochs: int = 0, devices: int = 1, accelerator: str = 'cpu', metrics: List[str] = ['accuracy'])[source]

This class supports the implementation of CORrelation ALignment (CORAL) for deep domain adaptation.

NOTE: CORAL is an unsupervised domain adaptation method: it uses only labeled source data and unlabeled target data. This means the target dataset does not need to contain labels.
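For intuition, the CORAL loss measures the distance between the second-order statistics (covariance matrices) of source and target feature batches. Below is a minimal NumPy sketch of that loss following the Deep CORAL formulation; it is illustrative only — the trainer computes this internally on the extractor's features, and the function name here is not part of the torcheeg API.

```python
import numpy as np

def coral_loss(source, target):
    """Squared Frobenius distance between the feature covariance
    matrices of the source and target batches, scaled by 1 / (4 d^2)
    as in the Deep CORAL paper. Inputs are (batch, features) arrays."""
    d = source.shape[1]
    # np.cov with rowvar=False treats each column as a feature.
    c_s = np.cov(source, rowvar=False)
    c_t = np.cov(target, rowvar=False)
    return np.sum((c_s - c_t) ** 2) / (4 * d ** 2)

rng = np.random.default_rng(0)
source_feat = rng.normal(size=(32, 8))  # features from the source domain
target_feat = rng.normal(size=(32, 8))  # features from an unlabeled target batch
print(coral_loss(source_feat, target_feat))
```

The loss is zero when the two batches share identical covariance structure, and grows as their second-order statistics diverge.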

from torcheeg.models import CCNN
from torcheeg.trainers import CORALTrainer

class Extractor(CCNN):
    # Reuse the convolutional layers of CCNN as the feature extractor.
    def forward(self, x):
        x = self.conv1(x)
        x = self.conv2(x)
        x = self.conv3(x)
        x = self.conv4(x)
        x = x.flatten(start_dim=1)
        return x

class Classifier(CCNN):
    # Reuse the linear layers of CCNN as the classification head.
    def forward(self, x):
        x = self.lin1(x)
        x = self.lin2(x)
        return x

extractor = Extractor(in_channels=5, num_classes=3)
classifier = Classifier(in_channels=5, num_classes=3)

trainer = CORALTrainer(extractor,
                       classifier,
                       num_classes=3,
                       devices=1,
                       weight_domain=1.0,
                       accelerator='gpu')
Parameters:
  • extractor (nn.Module) – The feature extraction model, which learns a feature representation of the EEG signal by forcing the covariance matrices of the source and target features to be close.

  • classifier (nn.Module) – The classification model, which learns the classification task from the labeled source data using the features produced by the extractor. Its output dimension should equal the number of categories in the dataset; the output layer does not need a softmax activation.

  • num_classes (int) – The number of categories in the dataset.

  • lr (float) – The learning rate. (default: 0.0001)

  • weight_decay (float) – The weight decay. (default: 0.0)

  • weight_domain (float) – The weight of the CORAL loss. (default: 1.0)

  • weight_scheduler (bool) – Whether to use a scheduler for the weight of the CORAL loss, growing from 0 to 1 following the schedule from the DANN paper. (default: True)

  • lr_scheduler_gamma (float) – The gamma parameter of the learning-rate annealing schedule from the DANN paper; a value of 0.0 disables annealing. (default: 0.0)

  • lr_scheduler_decay (float) – The decay exponent of the learning-rate annealing schedule. (default: 0.75)

  • warmup_epochs (int) – The number of epochs for the warmup phase, during which the weight of the CORAL loss is 0. (default: 0)

  • devices (int) – The number of devices to use. (default: 1)

  • accelerator (str) – The accelerator to use. Available options are: ‘cpu’, ‘gpu’. (default: "cpu")

  • metrics (list of str) – The metrics to use. Available options are: ‘precision’, ‘recall’, ‘f1score’, ‘accuracy’, ‘matthews’, ‘auroc’, and ‘kappa’. (default: ["accuracy"])
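The domain-loss weight scheduler and the learning-rate annealing both follow the schedules from the DANN paper (Ganin et al.). A sketch of the two formulas, assuming the standard DANN forms — the function names here are illustrative, not part of the torcheeg API:

```python
import math

def domain_weight(p):
    # DANN weight schedule: grows smoothly from 0 to ~1 as the
    # training progress p goes from 0 to 1.
    return 2.0 / (1.0 + math.exp(-10.0 * p)) - 1.0

def annealed_lr(lr, p, gamma=10.0, decay=0.75):
    # DANN learning-rate annealing: lr_p = lr / (1 + gamma * p) ** decay.
    # With gamma = 0 the learning rate stays constant.
    return lr / (1.0 + gamma * p) ** decay

for p in (0.0, 0.5, 1.0):
    print(p, domain_weight(p), annealed_lr(1e-4, p))
```

With `warmup_epochs > 0`, the CORAL loss weight is held at 0 during warmup before the schedule above takes over.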

fit(source_loader: DataLoader, target_loader: DataLoader, val_loader: DataLoader, max_epochs: int = 300, *args, **kwargs)[source]
Parameters:
  • source_loader (DataLoader) – Iterable DataLoader for traversing the data batch from the source domain (torch.utils.data.dataloader.DataLoader, torch_geometric.loader.DataLoader, etc).

  • target_loader (DataLoader) – Iterable DataLoader for traversing the training data batch from the target domain (torch.utils.data.dataloader.DataLoader, torch_geometric.loader.DataLoader, etc). The target dataset does not have to return labels.

  • val_loader (DataLoader) – Iterable DataLoader for traversing the validation data batch (torch.utils.data.dataloader.DataLoader, torch_geometric.loader.DataLoader, etc).

  • max_epochs (int) – The maximum number of epochs to train. (default: 300)

test(test_loader: DataLoader, *args, **kwargs) → List[Dict[str, float]][source]
Parameters:
  • test_loader (DataLoader) – Iterable DataLoader for traversing the test data batch (torch.utils.data.dataloader.DataLoader, torch_geometric.loader.DataLoader, etc).

