kosmos.ml.config.factories.optimizer¶
Module Attributes¶
- type ParamsT = Iterable[torch.Tensor] | Iterable[dict[str, Any]] | Iterable[tuple[str, torch.Tensor]]¶
Classes¶
- class OptimizerConfig¶
Bases:
abc.ABC
Optimizer configuration.
Methods
- get_instance(params: ParamsT) torch.optim.Optimizer¶
Get the optimizer instance.
- Parameters:
params (ParamsT) – Parameters to optimize.
- Returns:
Optimizer instance.
- Return type:
Optimizer
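The class above defines only the abstract contract: a config object holds hyperparameters and builds an optimizer on demand via `get_instance`. A minimal sketch of that pattern, using a stand-in `Optimizer` class and a hypothetical `ConstantLRConfig` subclass for illustration (neither is part of this module):

```python
from abc import ABC, abstractmethod
from typing import Any, Iterable


class Optimizer:
    """Stand-in for torch.optim.Optimizer, for illustration only."""

    def __init__(self, params: Iterable[Any], **defaults: Any) -> None:
        # Real torch optimizers also group params; we keep one group.
        self.param_groups = [{"params": list(params), **defaults}]


class OptimizerConfig(ABC):
    """Optimizer configuration: holds hyperparameters, builds optimizers."""

    @abstractmethod
    def get_instance(self, params: Iterable[Any]) -> Optimizer:
        """Build an optimizer over the given parameters."""


class ConstantLRConfig(OptimizerConfig):
    """Hypothetical concrete config, not part of the documented module."""

    def __init__(self, lr: float = 0.001) -> None:
        self.lr = lr

    def get_instance(self, params: Iterable[Any]) -> Optimizer:
        # Forward the stored hyperparameters to the optimizer constructor.
        return Optimizer(params, lr=self.lr)


opt = ConstantLRConfig(lr=0.01).get_instance([1.0, 2.0])
```

Separating configuration from construction lets the same config build fresh optimizer instances for different parameter sets (e.g. per model replica).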
- class SGDOptimizerConfig(lr: float = 0.001, momentum: float = 0.0, weight_decay: float = 0.0, *, nesterov: bool = False)¶
Bases:
OptimizerConfig
Stochastic gradient descent (SGD) optimizer configuration.
Initialize the SGD optimizer configuration.
- Parameters:
lr (float) – Learning rate.
momentum (float) – Momentum factor.
weight_decay (float) – Weight decay (L2 penalty).
nesterov (bool) – Whether to enable Nesterov momentum.
Methods
- get_instance(params: ParamsT) torch.optim.SGD¶
Get the SGD optimizer instance.
- Parameters:
params (ParamsT) – Parameters to optimize.
- Returns:
SGD optimizer instance.
- Return type:
SGD
- class AdamOptimizerConfig(lr: float = 0.001, weight_decay: float = 0.0)¶
Bases:
OptimizerConfig
Adam optimizer configuration.
Initialize the Adam optimizer configuration.
- Parameters:
lr (float) – Learning rate.
weight_decay (float) – Weight decay (L2 penalty).
Methods
- get_instance(params: ParamsT) torch.optim.Adam¶
Get the Adam optimizer instance.
- Parameters:
params (ParamsT) – Parameters to optimize.
- Returns:
Adam optimizer instance.
- Return type:
Adam
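Both concrete configs presumably forward their stored fields to the matching `torch.optim` constructor. A sketch of that behavior, reproducing the documented signatures; the dataclass bodies are assumptions for illustration, not the module's actual implementation:

```python
from dataclasses import dataclass

import torch


@dataclass
class SGDOptimizerConfig:
    """Sketch: forwards fields to torch.optim.SGD."""

    lr: float = 0.001
    momentum: float = 0.0
    weight_decay: float = 0.0
    nesterov: bool = False

    def get_instance(self, params) -> torch.optim.SGD:
        return torch.optim.SGD(
            params,
            lr=self.lr,
            momentum=self.momentum,
            weight_decay=self.weight_decay,
            nesterov=self.nesterov,
        )


@dataclass
class AdamOptimizerConfig:
    """Sketch: forwards fields to torch.optim.Adam."""

    lr: float = 0.001
    weight_decay: float = 0.0

    def get_instance(self, params) -> torch.optim.Adam:
        return torch.optim.Adam(params, lr=self.lr, weight_decay=self.weight_decay)


# Usage: build optimizers for a model's parameters from the configs.
model = torch.nn.Linear(4, 2)
sgd = SGDOptimizerConfig(lr=0.01, momentum=0.9).get_instance(model.parameters())
adam = AdamOptimizerConfig(lr=0.0003).get_instance(model.parameters())
```

Note that `torch.optim.SGD` raises a `ValueError` if `nesterov=True` is combined with zero momentum, so callers enabling Nesterov momentum should also set `momentum > 0`.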