hyperparameters_callback

Callback that saves the CAREamics configuration as hyperparameters in the model.

HyperParametersCallback #

Bases: Callback

Callback that saves the CAREamics configuration as hyperparameters in the model.

This allows the configuration to be saved as a dictionary in the checkpoints and subsequently reloaded into a CAREamist instance.

Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| `config` | `Configuration` | CAREamics configuration to be saved as hyperparameters in the model. | required |

Attributes:

| Name | Type | Description |
| --- | --- | --- |
| `config` | `Configuration` | CAREamics configuration to be saved as hyperparameters in the model. |

Source code in src/careamics/lightning/callbacks/hyperparameters_callback.py
class HyperParametersCallback(Callback):
    """
    Callback allowing saving CAREamics configuration as hyperparameters in the model.

    This allows saving the configuration as dictionary in the checkpoints, and
    loading it subsequently in a CAREamist instance.

    Parameters
    ----------
    config : Configuration
        CAREamics configuration to be saved as hyperparameter in the model.

    Attributes
    ----------
    config : Configuration
        CAREamics configuration to be saved as hyperparameter in the model.
    """

    def __init__(self, config: Configuration) -> None:
        """
        Constructor.

        Parameters
        ----------
        config : Configuration
            CAREamics configuration to be saved as hyperparameter in the model.
        """
        self.config = config

    def on_train_start(self, trainer: Trainer, pl_module: LightningModule) -> None:
        """
        Update the hyperparameters of the model with the configuration on train start.

        Parameters
        ----------
        trainer : Trainer
            PyTorch Lightning trainer, unused.
        pl_module : LightningModule
            PyTorch Lightning module.
        """
        pl_module.hparams.update(self.config.model_dump())
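The mechanism can be illustrated with a minimal, self-contained sketch. `FakeConfig` and `FakeModule` below are hypothetical stand-ins for careamics' pydantic `Configuration` and a `LightningModule`, so the example runs without the real dependencies; the `on_train_start` body is the same one-liner as in the source above.

```python
class FakeConfig:
    """Stand-in for careamics' pydantic Configuration (hypothetical)."""

    def __init__(self, **kwargs):
        self._data = kwargs

    def model_dump(self):
        # Mimics pydantic's BaseModel.model_dump, returning a plain dict.
        return dict(self._data)


class FakeModule:
    """Stand-in for a LightningModule: hparams behaves like a dict."""

    def __init__(self):
        self.hparams = {}


class HyperParametersCallback:
    def __init__(self, config):
        self.config = config

    def on_train_start(self, trainer, pl_module):
        # Merge the configuration dict into the module's hyperparameters,
        # so Lightning persists it in every checkpoint it writes.
        pl_module.hparams.update(self.config.model_dump())


config = FakeConfig(algorithm="n2v", patch_size=[64, 64])
module = FakeModule()
HyperParametersCallback(config).on_train_start(trainer=None, pl_module=module)
print(module.hparams)  # {'algorithm': 'n2v', 'patch_size': [64, 64]}
```

With the real classes, the callback is simply passed to the Lightning `Trainer` via its `callbacks` argument; Lightning then invokes `on_train_start` automatically.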

__init__(config) #

Constructor.

Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| `config` | `Configuration` | CAREamics configuration to be saved as hyperparameters in the model. | required |
Source code in src/careamics/lightning/callbacks/hyperparameters_callback.py
def __init__(self, config: Configuration) -> None:
    """
    Constructor.

    Parameters
    ----------
    config : Configuration
        CAREamics configuration to be saved as hyperparameter in the model.
    """
    self.config = config

on_train_start(trainer, pl_module) #

Update the hyperparameters of the model with the configuration on train start.

Parameters:

| Name | Type | Description | Default |
| --- | --- | --- | --- |
| `trainer` | `Trainer` | PyTorch Lightning trainer, unused. | required |
| `pl_module` | `LightningModule` | PyTorch Lightning module. | required |
Source code in src/careamics/lightning/callbacks/hyperparameters_callback.py
def on_train_start(self, trainer: Trainer, pl_module: LightningModule) -> None:
    """
    Update the hyperparameters of the model with the configuration on train start.

    Parameters
    ----------
    trainer : Trainer
        PyTorch Lightning trainer, unused.
    pl_module : LightningModule
        PyTorch Lightning module.
    """
    pl_module.hparams.update(self.config.model_dump())
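Because Lightning stores a module's hparams in the checkpoint under the `hyper_parameters` key, the configuration merged in by this hook can be recovered later. The roundtrip can be sketched with plain dicts standing in for the real torch checkpoint and the pydantic `Configuration` (the actual loading code would also validate the dict):

```python
# Hypothetical roundtrip sketch: plain dicts stand in for a real torch
# checkpoint file and for careamics' Configuration.
config_dict = {"algorithm": "n2v", "patch_size": [64, 64]}

# During training, on_train_start merges the config into pl_module.hparams;
# Lightning then writes hparams into the checkpoint under "hyper_parameters".
checkpoint = {"state_dict": {}, "hyper_parameters": dict(config_dict)}

# Later, a CAREamist-like loader can read the configuration back out of the
# checkpoint and rebuild a Configuration object from it.
restored = checkpoint["hyper_parameters"]
print(restored == config_dict)  # True
```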