algorithm_factory
Convenience functions to create algorithm configurations.
algorithm_factory(algorithm)
Create an algorithm model for training CAREamics.
Parameters:
| Name | Type | Description | Default |
|---|---|---|---|
| algorithm | dict | Algorithm dictionary. | required |
Returns:
| Type | Description |
|---|---|
| N2VAlgorithm or N2NAlgorithm or CAREAlgorithm | Algorithm model for training CAREamics. |
Source code in src/careamics/config/ng_factories/algorithm_factory.py
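A minimal usage sketch, assuming the import path follows the source file location above. The algorithm dictionary is built with create_algorithm_configuration, documented below; the parameter values are illustrative only.

```python
# Sketch: build an algorithm dictionary and pass it to algorithm_factory.
# The import path is assumed from the source location shown above.
from careamics.config.ng_factories.algorithm_factory import (
    algorithm_factory,
    create_algorithm_configuration,
)

# Assemble the algorithm dictionary with the helper documented below.
algorithm_dict = create_algorithm_configuration(
    dimensions=2,
    algorithm="n2v",
    loss="n2v",
    independent_channels=True,
    n_channels_in=1,
    n_channels_out=1,
)

# Dispatch to the matching model (here an N2VAlgorithm).
algorithm_model = algorithm_factory(algorithm_dict)
```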
create_algorithm_configuration(dimensions, algorithm, loss, independent_channels, n_channels_in, n_channels_out, use_n2v2=False, model_params=None, optimizer='Adam', optimizer_params=None, lr_scheduler='ReduceLROnPlateau', lr_scheduler_params=None)
Create a dictionary with the parameters of the algorithm model.
Parameters:
| Name | Type | Description | Default |
|---|---|---|---|
| dimensions | (2, 3) | Dimension of the model, either 2D or 3D. | 2 |
| algorithm | ("n2v", "care", "n2n") | Algorithm to use. | "n2v" |
| loss | ("n2v", "mae", "mse") | Loss function to use. | "n2v" |
| independent_channels | bool | Whether to train all channels independently. | required |
| n_channels_in | int | Number of input channels. | required |
| n_channels_out | int | Number of output channels. | required |
| use_n2v2 | bool | Whether to use N2V2. | False |
| model_params | dict | UNetModel parameters. | None |
| optimizer | ("Adam", "Adamax", "SGD") | Optimizer to use. | "Adam" |
| optimizer_params | dict | Parameters for the optimizer; see the PyTorch documentation for details. | None |
| lr_scheduler | ("ReduceLROnPlateau", "StepLR") | Learning rate scheduler to use. | "ReduceLROnPlateau" |
| lr_scheduler_params | dict | Parameters for the learning rate scheduler; see the PyTorch documentation for details. | None |
Returns:
| Type | Description |
|---|---|
| dict | Algorithm model as dictionary with the specified parameters. |
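A sketch showing the optional optimizer and scheduler knobs; the values are illustrative, not recommendations. The optimizer_params and lr_scheduler_params dictionaries are forwarded to the corresponding PyTorch classes, so any of their keyword arguments can be used.

```python
# Sketch: a 3D CARE configuration with a custom optimizer and scheduler.
# The import path is assumed from the source location shown above.
from careamics.config.ng_factories.algorithm_factory import (
    create_algorithm_configuration,
)

config = create_algorithm_configuration(
    dimensions=3,
    algorithm="care",
    loss="mae",
    independent_channels=False,
    n_channels_in=2,
    n_channels_out=2,
    optimizer="SGD",
    optimizer_params={"lr": 1e-4},          # forwarded to torch.optim.SGD
    lr_scheduler="StepLR",
    lr_scheduler_params={"step_size": 10},  # forwarded to torch.optim.lr_scheduler.StepLR
)

# `config` is a plain dict that can be passed to algorithm_factory above.
```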