# linear_warmup

`optimus_dl.modules.lr_scheduler.linear_warmup`

## LinearWarmupLR

Bases: `BaseLRScheduler`

Linear warmup learning rate scheduler.
Linearly increases the learning rate from start_lr to target_lr over a
specified number of steps. Once the warmup phase is complete, the learning
rate is held constant at target_lr.
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| `cfg` | `LinearWarmupLRConfig` | Scheduler configuration. | *required* |
| `optimizer` | `Optimizer` | Managed optimizer. | *required* |
| `iterations` | `int` | Total training iterations (used to calculate warmup steps if configured by percentage). | *required* |
Source code in optimus_dl/modules/lr_scheduler/linear_warmup.py
### get_lr()
Calculate learning rates using the linear warmup formula.
Source code in optimus_dl/modules/lr_scheduler/linear_warmup.py
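The warmup rule described above can be sketched as a plain function (the function name and exact boundary handling here are illustrative assumptions, not the library's API):

```python
def linear_warmup_lr(step: int, warmup_steps: int,
                     start_lr: float, target_lr: float) -> float:
    """Linearly interpolate from start_lr to target_lr over warmup_steps,
    then hold the learning rate constant at target_lr."""
    if step >= warmup_steps:
        # Warmup complete: hold at the target learning rate.
        return target_lr
    frac = step / warmup_steps
    return start_lr + frac * (target_lr - start_lr)
```

For example, with `start_lr=0.0`, `target_lr=1.0`, and `warmup_steps=100`, step 50 yields a learning rate of 0.5, and any step past 100 stays at 1.0.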
### load_state_dict(state_dict)
Restore the scheduler's state.
Source code in optimus_dl/modules/lr_scheduler/linear_warmup.py
### state_dict()
Return the scheduler's state, including warmup-specific parameters.
Source code in optimus_dl/modules/lr_scheduler/linear_warmup.py
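The `state_dict()`/`load_state_dict()` pair follows the usual PyTorch scheduler convention: serialize everything needed to resume, including the warmup-specific parameters. A minimal sketch of that round trip (the class and attribute names are hypothetical, not taken from the library):

```python
class WarmupStateSketch:
    """Hypothetical minimal analogue of a warmup scheduler's state handling."""

    def __init__(self, warmup_steps: int, start_lr: float, target_lr: float):
        self.step_num = 0
        self.warmup_steps = warmup_steps
        self.start_lr = start_lr
        self.target_lr = target_lr

    def state_dict(self) -> dict:
        # Everything needed to resume: the step counter plus
        # the warmup-specific parameters.
        return {
            "step_num": self.step_num,
            "warmup_steps": self.warmup_steps,
            "start_lr": self.start_lr,
            "target_lr": self.target_lr,
        }

    def load_state_dict(self, state_dict: dict) -> None:
        # Restore state in place, mirroring torch.optim scheduler semantics.
        self.__dict__.update(state_dict)
```

Saving the dict before a checkpoint and loading it into a freshly constructed scheduler restores the warmup progress exactly.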
## LinearWarmupLRConfig dataclass

Bases: `BaseLRSchedulerConfig`
Configuration for linear warmup learning rate scheduler.
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| `warmup_steps` | `int \| None` |  | `None` |
| `warmup_percent` | `float \| None` |  | `0.05` |
| `target_lr` | `float \| None` |  | `None` |
| `start_lr` | `float` |  | `0.0` |