:py:mod:`kosmos.ml.config.factories.lr_scheduler`
=================================================

.. py:module:: kosmos.ml.config.factories.lr_scheduler

Classes
-------

.. py:class:: LearningRateSchedulerConfig

   Bases: :py:class:`abc.ABC`

   Learning rate scheduler configuration.

   .. rubric:: Methods

   .. py:method:: get_instance(optimizer: torch.optim.Optimizer) -> torch.optim.lr_scheduler.LRScheduler

      Get the learning rate scheduler instance.

      :param optimizer: Optimizer instance.
      :type optimizer: Optimizer
      :returns: Learning rate scheduler instance.
      :rtype: LRScheduler

----

.. py:class:: StepLearningRateSchedulerConfig(step_size: int, gamma: float = 0.1)

   Bases: :py:class:`LearningRateSchedulerConfig`

   Step learning rate scheduler configuration.

   Initialize the step learning rate scheduler configuration.

   :param step_size: Period of learning rate decay.
   :type step_size: int
   :param gamma: Multiplicative factor of learning rate decay. Defaults to 0.1.
   :type gamma: float

   .. rubric:: Methods

   .. py:method:: get_instance(optimizer: torch.optim.Optimizer) -> torch.optim.lr_scheduler.StepLR

      Get the step learning rate scheduler instance.

      :param optimizer: Optimizer instance.
      :type optimizer: Optimizer
      :returns: Step learning rate scheduler instance.
      :rtype: StepLR

----

.. py:class:: ExponentialLearningRateSchedulerConfig(gamma: float)

   Bases: :py:class:`LearningRateSchedulerConfig`

   Exponential learning rate scheduler configuration.

   Initialize the exponential learning rate scheduler configuration.

   :param gamma: Multiplicative factor of learning rate decay.
   :type gamma: float

   .. rubric:: Methods

   .. py:method:: get_instance(optimizer: torch.optim.Optimizer) -> torch.optim.lr_scheduler.ExponentialLR

      Get the exponential learning rate scheduler instance.

      :param optimizer: Optimizer instance.
      :type optimizer: Optimizer
      :returns: Exponential learning rate scheduler instance.
      :rtype: ExponentialLR

----

.. py:class:: CosineLearningRateSchedulerConfig(max_epochs: int, min_lr: float = 0.0)

   Bases: :py:class:`LearningRateSchedulerConfig`

   Cosine annealing learning rate scheduler configuration.

   Initialize the cosine learning rate scheduler configuration.

   :param max_epochs: Maximum number of epochs (iterations for the scheduler).
   :type max_epochs: int
   :param min_lr: Minimum learning rate. Defaults to 0.0.
   :type min_lr: float

   .. rubric:: Methods

   .. py:method:: get_instance(optimizer: torch.optim.Optimizer) -> torch.optim.lr_scheduler.CosineAnnealingLR

      Get the cosine learning rate scheduler instance.

      :param optimizer: Optimizer instance.
      :type optimizer: Optimizer
      :returns: Cosine learning rate scheduler instance.
      :rtype: CosineAnnealingLR
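
For orientation, the decay rules that these three concrete configurations correspond to can be sketched in plain Python. This is a hedged sketch derived from the documented parameters and the standard closed forms for these scheduler types (``lr0`` denotes the optimizer's initial learning rate); it is not an import of the kosmos implementation:

```python
import math


def step_lr(lr0: float, step_size: int, gamma: float, t: int) -> float:
    """Step decay: multiply the rate by gamma every `step_size` epochs."""
    return lr0 * gamma ** (t // step_size)


def exponential_lr(lr0: float, gamma: float, t: int) -> float:
    """Exponential decay: multiply the rate by gamma every epoch."""
    return lr0 * gamma ** t


def cosine_lr(lr0: float, max_epochs: int, min_lr: float, t: int) -> float:
    """Cosine annealing: glide from lr0 down to min_lr over max_epochs."""
    return min_lr + (lr0 - min_lr) * (1 + math.cos(math.pi * t / max_epochs)) / 2


# With lr0 = 0.1:
print(step_lr(0.1, step_size=10, gamma=0.1, t=25))        # two drops applied
print(exponential_lr(0.1, gamma=0.9, t=3))                # 0.1 * 0.9 ** 3
print(cosine_lr(0.1, max_epochs=100, min_lr=0.0, t=50))   # halfway point
print(cosine_lr(0.1, max_epochs=100, min_lr=0.0, t=100))  # reaches min_lr
```

In the intended usage, a config object would presumably be constructed once (e.g. ``StepLearningRateSchedulerConfig(step_size=10, gamma=0.5)``) and ``get_instance(optimizer)`` called to obtain the corresponding ``torch.optim.lr_scheduler`` object, whose ``step()`` advances ``t`` by one epoch.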