load_checkpoint
Module for loading CAREamics models and configs from a checkpoint.
load_config_from_checkpoint(checkpoint_path) #
Load a CAREamics config from a checkpoint.
If missing from the checkpoint, the following fields are populated with defaults: `version`, `training_config` and `experiment_name`.
The default for `experiment_name` is `"loaded_from_<checkpoint_filename>"`.
Parameters:
| Name | Type | Description | Default |
|---|---|---|---|
| checkpoint_path | Path | Path to the PyTorch Lightning checkpoint file. | required |
Returns:
| Type | Description |
|---|---|
| Configuration | A CAREamics configuration object. |
Raises:
| Type | Description |
|---|---|
| ValueError | If required information is missing from the checkpoint. |
Source code in src/careamics/lightning/dataset_ng/load_checkpoint.py
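A minimal usage sketch for this function, assuming `careamics` is installed and the import path matches the source location above (`careamics.lightning.dataset_ng.load_checkpoint`); the checkpoint path and the `restore_config` wrapper are hypothetical:

```python
from pathlib import Path

def restore_config(checkpoint_path: Path):
    """Sketch: recover a CAREamics configuration from a .ckpt file."""
    # import path inferred from the source location above; requires careamics
    from careamics.lightning.dataset_ng.load_checkpoint import (
        load_config_from_checkpoint,
    )
    config = load_config_from_checkpoint(checkpoint_path)
    # if the checkpoint lacked an experiment name, it defaults to
    # "loaded_from_<checkpoint_filename>"
    print(config.experiment_name)
    return config
```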
load_module_from_checkpoint(checkpoint_path) #
Load a trained CAREamics module from checkpoint.
Automatically detects the algorithm type from the checkpoint and loads the appropriate module with trained weights.
Parameters:
| Name | Type | Description | Default |
|---|---|---|---|
| checkpoint_path | Path | Path to the PyTorch Lightning checkpoint file. | required |
Returns:
| Type | Description |
|---|---|
| CAREamicsModule | Lightning module with loaded weights. |
Raises:
| Type | Description |
|---|---|
| ValueError | If the algorithm type cannot be determined from the checkpoint. |
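A minimal usage sketch for this function, assuming `careamics` is installed and the same import path as above; the `restore_module` wrapper is hypothetical:

```python
from pathlib import Path

def restore_module(checkpoint_path: Path):
    """Sketch: reload a trained CAREamics Lightning module for inference."""
    # import path inferred from the source location above; requires careamics
    from careamics.lightning.dataset_ng.load_checkpoint import (
        load_module_from_checkpoint,
    )
    # raises ValueError if the algorithm type cannot be determined
    module = load_module_from_checkpoint(checkpoint_path)
    module.eval()  # standard Lightning/PyTorch call to switch to inference mode
    return module
```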