Caffe2 - Python API
A deep learning, cross-platform ML framework
torch.optim.lr_scheduler.MultiStepLR Class Reference
Inheritance diagram for torch.optim.lr_scheduler.MultiStepLR:
[Inherits torch.optim.lr_scheduler._LRScheduler; inherited by test_optim.LegacyMultiStepLR]

Public Member Functions

def __init__ (self, optimizer, milestones, gamma=0.1, last_epoch=-1)
 
def get_lr (self)
 
- Public Member Functions inherited from torch.optim.lr_scheduler._LRScheduler
def __init__ (self, optimizer, last_epoch=-1)
 
def state_dict (self)
 
def load_state_dict (self, state_dict)
 
def get_lr (self)
 
def step (self, epoch=None)
 

Public Attributes

 milestones
 
 gamma
 
- Public Attributes inherited from torch.optim.lr_scheduler._LRScheduler
 optimizer
 
 base_lrs
 
 last_epoch
 

Detailed Description

Decays the learning rate of each parameter group by gamma once the
number of epochs reaches one of the milestones. Notice that such decay can
happen simultaneously with other changes to the learning rate from outside
this scheduler. When last_epoch=-1, sets initial lr as lr.
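
Equivalently (a paraphrased sketch of the rule, not text from the source file), the rate for a group is its base rate times gamma raised to the count of milestones already reached; multistep_lr below is a hypothetical helper name:

    >>> from bisect import bisect_right
    >>> def multistep_lr(base_lr, gamma, milestones, epoch):
    >>>     # gamma is applied once per milestone already reached
    >>>     return base_lr * gamma ** bisect_right(milestones, epoch)
    >>> multistep_lr(0.05, 0.1, [30, 80], 45)   # -> 0.005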

Args:
    optimizer (Optimizer): Wrapped optimizer.
    milestones (list): List of epoch indices. Must be increasing.
    gamma (float): Multiplicative factor of learning rate decay.
        Default: 0.1.
    last_epoch (int): The index of last epoch. Default: -1.

Example:
    >>> # Assuming optimizer uses lr = 0.05 for all groups
    >>> # lr = 0.05     if epoch < 30
    >>> # lr = 0.005    if 30 <= epoch < 80
    >>> # lr = 0.0005   if epoch >= 80
    >>> scheduler = MultiStepLR(optimizer, milestones=[30,80], gamma=0.1)
    >>> for epoch in range(100):
    >>>     train(...)
    >>>     validate(...)
    >>>     scheduler.step()
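
The example above assumes train(...) and validate(...) are defined elsewhere. Below is a self-contained sketch that can be run as-is to check the schedule; the single throwaway parameter and the probed epochs are illustrative assumptions, not part of the original example.

    >>> import torch
    >>> from torch.optim import SGD
    >>> from torch.optim.lr_scheduler import MultiStepLR
    >>> param = torch.nn.Parameter(torch.zeros(1))   # stand-in for real model parameters
    >>> optimizer = SGD([param], lr=0.05)
    >>> scheduler = MultiStepLR(optimizer, milestones=[30, 80], gamma=0.1)
    >>> for epoch in range(100):
    >>>     if epoch in (0, 30, 80):
    >>>         print(epoch, optimizer.param_groups[0]['lr'])   # 0.05, 0.005, 0.0005
    >>>     optimizer.step()       # would normally follow loss.backward()
    >>>     scheduler.step()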

Definition at line 164 of file lr_scheduler.py.
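
The inherited state_dict/load_state_dict pair round-trips the scheduler's state (everything in its __dict__ except the wrapped optimizer), which is useful for checkpointing. A minimal sketch, assuming the optimizer and scheduler from the example above; 'scheduler.pt' is a hypothetical path:

    >>> state = scheduler.state_dict()               # excludes the optimizer itself
    >>> torch.save(state, 'scheduler.pt')
    >>> resumed = MultiStepLR(optimizer, milestones=[30, 80], gamma=0.1)
    >>> resumed.load_state_dict(torch.load('scheduler.pt'))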


The documentation for this class was generated from the following file:
- lr_scheduler.py