kosmos.ml.config.factories.lr_scheduler¶
Classes¶
- class LearningRateSchedulerConfig[source]¶
Bases:
abc.ABC
Learning rate scheduler configuration.
Methods
- get_instance(optimizer: torch.optim.Optimizer) torch.optim.lr_scheduler.LRScheduler[source]¶
Get the learning rate scheduler instance.
- Parameters:
optimizer (Optimizer) – Optimizer instance.
- Returns:
Learning rate scheduler instance.
- Return type:
LRScheduler
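All concrete configurations below follow the same factory pattern: a configuration object stores the scheduler hyperparameters and builds the scheduler only when handed an optimizer via get_instance. A minimal pure-Python sketch of that pattern (the class names SchedulerConfig and StepConfig are illustrative stand-ins; the real classes return torch.optim.lr_scheduler objects):

```python
from abc import ABC, abstractmethod


class SchedulerConfig(ABC):
    """Stand-in for LearningRateSchedulerConfig."""

    @abstractmethod
    def get_instance(self, optimizer):
        """Build and return a scheduler bound to `optimizer`."""


class StepConfig(SchedulerConfig):
    """Stand-in for StepLearningRateSchedulerConfig."""

    def __init__(self, step_size, gamma=0.1):
        self.step_size = step_size
        self.gamma = gamma

    def get_instance(self, optimizer):
        # The real config would instead return
        # torch.optim.lr_scheduler.StepLR(optimizer, self.step_size, self.gamma)
        return {"optimizer": optimizer,
                "step_size": self.step_size,
                "gamma": self.gamma}


# The config can be created long before the optimizer exists,
# which is the point of deferring construction to get_instance.
config = StepConfig(step_size=10, gamma=0.5)
scheduler = config.get_instance(optimizer="sgd")
```

Keeping hyperparameters in a plain config object makes the scheduler choice serializable and testable independently of any model or optimizer.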
- class StepLearningRateSchedulerConfig(step_size: int, gamma: float = 0.1)[source]¶
Bases:
LearningRateSchedulerConfig
Step learning rate scheduler configuration.
Initialize the step learning rate scheduler configuration.
- Parameters:
step_size (int) – Period of learning rate decay.
gamma (float) – Multiplicative factor of learning rate decay.
Methods
- get_instance(optimizer: torch.optim.Optimizer) torch.optim.lr_scheduler.StepLR[source]¶
Get the step learning rate scheduler instance.
- Parameters:
optimizer (Optimizer) – Optimizer instance.
- Returns:
Step learning rate scheduler instance.
- Return type:
StepLR
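For reference, a StepLR schedule multiplies the learning rate by gamma once every step_size epochs; the resulting rate at any epoch can be sketched in closed form (step_lr is an illustrative helper, not part of this module):

```python
def step_lr(initial_lr: float, epoch: int, step_size: int,
            gamma: float = 0.1) -> float:
    """Learning rate after `epoch` epochs under a StepLR schedule:
    the rate is multiplied by `gamma` every `step_size` epochs."""
    return initial_lr * gamma ** (epoch // step_size)


# With step_size=10 and gamma=0.5 the rate holds for 10 epochs,
# then halves at epoch 10, halves again at epoch 20, and so on.
lrs = [step_lr(0.1, e, step_size=10, gamma=0.5) for e in (0, 9, 10, 20)]
```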
- class ExponentialLearningRateSchedulerConfig(gamma: float)[source]¶
Bases:
LearningRateSchedulerConfig
Exponential learning rate scheduler configuration.
Initialize the exponential learning rate scheduler configuration.
- Parameters:
gamma (float) – Multiplicative factor of learning rate decay.
Methods
- get_instance(optimizer: torch.optim.Optimizer) torch.optim.lr_scheduler.ExponentialLR[source]¶
Get the exponential learning rate scheduler instance.
- Parameters:
optimizer (Optimizer) – Optimizer instance.
- Returns:
Exponential learning rate scheduler instance.
- Return type:
ExponentialLR
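An ExponentialLR schedule multiplies the learning rate by gamma every epoch, so the rate decays geometrically (exponential_lr is an illustrative helper, not part of this module):

```python
def exponential_lr(initial_lr: float, epoch: int, gamma: float) -> float:
    """Learning rate after `epoch` epochs under an ExponentialLR
    schedule: the rate is multiplied by `gamma` once per epoch."""
    return initial_lr * gamma ** epoch


# With gamma=0.9 the rate shrinks by 10% each epoch.
lrs = [exponential_lr(0.1, e, gamma=0.9) for e in range(3)]
```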
- class CosineLearningRateSchedulerConfig(max_epochs: int, min_lr: float = 0.0)[source]¶
Bases:
LearningRateSchedulerConfig
Cosine annealing learning rate scheduler configuration.
Initialize the cosine learning rate scheduler configuration.
- Parameters:
max_epochs (int) – Maximum number of epochs over which the learning rate is annealed.
min_lr (float) – Minimum learning rate.
Methods
- get_instance(optimizer: torch.optim.Optimizer) torch.optim.lr_scheduler.CosineAnnealingLR[source]¶
Get the cosine learning rate scheduler instance.
- Parameters:
optimizer (Optimizer) – Optimizer instance.
- Returns:
Cosine learning rate scheduler instance.
- Return type:
CosineAnnealingLR
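Assuming max_epochs maps to PyTorch's T_max and min_lr to eta_min, cosine annealing follows a half-cosine from the initial learning rate down to min_lr over max_epochs epochs. The closed form (cosine_lr is an illustrative helper, not part of this module):

```python
import math


def cosine_lr(initial_lr: float, epoch: int, max_epochs: int,
              min_lr: float = 0.0) -> float:
    """Learning rate after `epoch` epochs under cosine annealing:
    starts at `initial_lr`, reaches `min_lr` at `max_epochs`."""
    return min_lr + (initial_lr - min_lr) * (
        1 + math.cos(math.pi * epoch / max_epochs)) / 2


# The rate is halfway between initial_lr and min_lr at the midpoint
# and reaches min_lr exactly at max_epochs.
lrs = [cosine_lr(0.1, e, max_epochs=100) for e in (0, 50, 100)]
```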