Caffe2 - Python API
A deep learning, cross-platform ML framework
torch.optim.lr_scheduler.StepLR Class Reference
Inheritance diagram for torch.optim.lr_scheduler.StepLR:
Inherits torch.optim.lr_scheduler._LRScheduler; inherited by test_optim.LegacyStepLR.

Public Member Functions

def __init__ (self, optimizer, step_size, gamma=0.1, last_epoch=-1)
def get_lr (self)

- Public Member Functions inherited from torch.optim.lr_scheduler._LRScheduler

def __init__ (self, optimizer, last_epoch=-1)
def state_dict (self)
def load_state_dict (self, state_dict)
def get_lr (self)
def step (self, epoch=None)
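
The inherited state_dict and load_state_dict members above allow the scheduler to be checkpointed together with its optimizer and resumed from the saved last_epoch. A minimal sketch, assuming a toy model and a placeholder checkpoint path (neither is part of this class's API):

    import torch
    from torch import nn
    from torch.optim import SGD
    from torch.optim.lr_scheduler import StepLR

    # Toy model and optimizer; any parameter group works the same way.
    model = nn.Linear(4, 1)
    optimizer = SGD(model.parameters(), lr=0.05)
    scheduler = StepLR(optimizer, step_size=30, gamma=0.1)

    # Save optimizer and scheduler state together ('checkpoint.pt' is a placeholder path).
    torch.save({'optimizer': optimizer.state_dict(),
                'scheduler': scheduler.state_dict()}, 'checkpoint.pt')

    # Later: restore both, so stepping resumes from the saved last_epoch.
    state = torch.load('checkpoint.pt')
    optimizer.load_state_dict(state['optimizer'])
    scheduler.load_state_dict(state['scheduler'])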
 

Public Attributes

step_size
gamma
last_epoch

- Public Attributes inherited from torch.optim.lr_scheduler._LRScheduler

optimizer
base_lrs
last_epoch

Detailed Description

Decays the learning rate of each parameter group by gamma every
step_size epochs. Note that such decay can happen simultaneously with
other changes to the learning rate from outside this scheduler. When
last_epoch=-1, the initial learning rate is set to the optimizer's lr.

Args:
    optimizer (Optimizer): Wrapped optimizer.
    step_size (int): Period of learning rate decay.
    gamma (float): Multiplicative factor of learning rate decay.
        Default: 0.1.
    last_epoch (int): The index of last epoch. Default: -1.
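
Concretely, the decay described above is a step function of the epoch index: each parameter group's learning rate equals its initial value multiplied by gamma raised to the number of completed step_size periods. A standalone sketch of that rule (step_decay is an illustrative helper, not the library source):

    def step_decay(initial_lr, epoch, step_size, gamma=0.1):
        """Learning rate after `epoch` epochs under a StepLR-style decay."""
        return initial_lr * gamma ** (epoch // step_size)

    # With initial_lr=0.05 and step_size=30:
    #   step_decay(0.05, 29, 30)  -> 0.05
    #   step_decay(0.05, 30, 30)  -> 0.005
    #   step_decay(0.05, 60, 30)  -> 0.0005 (up to float rounding)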

Example:
    >>> # Assuming optimizer uses lr = 0.05 for all groups
    >>> # lr = 0.05     if epoch < 30
    >>> # lr = 0.005    if 30 <= epoch < 60
    >>> # lr = 0.0005   if 60 <= epoch < 90
    >>> # ...
    >>> scheduler = StepLR(optimizer, step_size=30, gamma=0.1)
    >>> for epoch in range(100):
    >>>     train(...)
    >>>     validate(...)
    >>>     scheduler.step()
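
For a fully self-contained run, a minimal sketch with explicit imports (the model, the stand-in training step, and the printed epochs are placeholders, not part of this class's API):

    import torch
    from torch import nn
    from torch.optim import SGD
    from torch.optim.lr_scheduler import StepLR

    model = nn.Linear(10, 1)
    optimizer = SGD(model.parameters(), lr=0.05)
    scheduler = StepLR(optimizer, step_size=30, gamma=0.1)

    for epoch in range(90):
        # A real training step would compute a loss and call loss.backward() first.
        optimizer.step()
        scheduler.step()
        if epoch in (0, 29, 30, 59, 60):
            # Inspect the learning rate currently applied to the parameter group.
            print(epoch, optimizer.param_groups[0]['lr'])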

Definition at line 126 of file lr_scheduler.py.


The documentation for this class was generated from the following file: lr_scheduler.py