# optimus_dl.modules.lr_scheduler.base

## BaseLRScheduler

Bases: `ABC`

Abstract base class for learning rate schedulers.

This class provides a uniform interface for learning rate scheduling that is decoupled from specific optimizer implementations. It manages the stepping of learning rates across multiple parameter groups and handles state serialization for checkpointing.
Attributes:

| Name | Type | Description |
|---|---|---|
| `optimizer` | `Optimizer` | The PyTorch optimizer whose learning rates are managed. |
| `base_lrs` | `list[float]` | Initial learning rates for each parameter group. |
Source code in `optimus_dl/modules/lr_scheduler/base.py`
### `last_epoch` *(property)*

The current step count (for compatibility with PyTorch schedulers).
### `__init__(optimizer, **kwargs)`

Initialize the scheduler.

Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| `optimizer` | `Optimizer` | The optimizer to manage. | *required* |
| `**kwargs` | | Additional keyword arguments. | `{}` |

Source code in `optimus_dl/modules/lr_scheduler/base.py`
### `get_last_lr()`
### `get_lr()` *(abstract method)*

Calculate the target learning rates for the current step.

Returns:

| Type | Description |
|---|---|
| `list[float]` | List of floats representing the new learning rates for each parameter group in the optimizer. |

Source code in `optimus_dl/modules/lr_scheduler/base.py`
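As a concrete example of what a `get_lr()` implementation might compute, here is the per-group arithmetic for a linear warmup policy. This is a sketch, not an optimus_dl scheduler, and `warmup_steps` is a hypothetical parameter:

```python
def warmup_lrs(base_lrs, step, warmup_steps):
    """Linearly scale each base rate from 0 up to its full value
    over the first `warmup_steps` steps, then hold it constant."""
    scale = min(1.0, step / warmup_steps)
    return [lr * scale for lr in base_lrs]


warmup_lrs([0.1], step=5, warmup_steps=10)   # scale = 0.5, half the base rate
warmup_lrs([0.1], step=20, warmup_steps=10)  # scale = 1.0, fully warmed up
```

A real subclass would return this list from `get_lr()`, reading the step count from its own internal counter rather than taking it as an argument.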
### `load_state_dict(state_dict)`

Restore the scheduler's state from a checkpoint.
### `set()`

Set the learning rates of the optimizer to the current values.
### `state_dict()`
### `step()`

Update the optimizer's learning rates based on the current step count.

This should be called at the end of each training iteration.
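The call pattern described above, including the `state_dict()`/`load_state_dict()` checkpoint round trip, can be sketched with a toy scheduler that tracks only its step count. The class below is an assumption for illustration, not optimus_dl code:

```python
class StepCounterScheduler:
    """Toy scheduler mimicking the documented stepping/serialization API."""

    def __init__(self):
        self._step_count = 0

    def step(self):
        # In a real scheduler this would also recompute and apply the LRs.
        self._step_count += 1

    def state_dict(self):
        return {"step_count": self._step_count}

    def load_state_dict(self, state):
        self._step_count = state["step_count"]


scheduler = StepCounterScheduler()
for _ in range(3):       # one training iteration per loop pass
    # ... forward, backward, optimizer.step() would go here ...
    scheduler.step()     # called at the end of each iteration

checkpoint = scheduler.state_dict()   # save alongside model/optimizer state
restored = StepCounterScheduler()
restored.load_state_dict(checkpoint)  # resumes with the same step count
```

Calling `step()` last in the iteration keeps the scheduler's step count in sync with the number of completed optimizer updates, which is what makes the checkpoint round trip resume cleanly.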