Public Member Functions

    def __init__ (self, params, lr=1.0, rho=0.9, eps=1e-6, weight_decay=0)
    def step (self, closure=None)
Implements the Adadelta algorithm; its update rule is sketched below the argument list.
It has been proposed in `ADADELTA: An Adaptive Learning Rate Method`__.
Arguments:
params (iterable): iterable of parameters to optimize or dicts defining
parameter groups
rho (float, optional): coefficient used for computing a running average
of squared gradients (default: 0.9)
eps (float, optional): term added to the denominator to improve
numerical stability (default: 1e-6)
lr (float, optional): coefficient that scales delta before it is applied
to the parameters (default: 1.0)
weight_decay (float, optional): weight decay (L2 penalty) (default: 0)
__ https://arxiv.org/abs/1212.5701
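A sketch of the update rule from the paper above, in LaTeX notation, showing where rho, eps, and lr enter (the accumulator notation is an assumption for exposition, not identifiers from adadelta.py):

    E[g^2]_t = \rho \, E[g^2]_{t-1} + (1 - \rho) \, g_t^2
    \Delta x_t = - \frac{\sqrt{E[\Delta x^2]_{t-1} + \epsilon}}{\sqrt{E[g^2]_t + \epsilon}} \, g_t
    E[\Delta x^2]_t = \rho \, E[\Delta x^2]_{t-1} + (1 - \rho) \, (\Delta x_t)^2
    x_{t+1} = x_t + \mathrm{lr} \cdot \Delta x_t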
Definition at line 6 of file adadelta.py.
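A minimal usage sketch, assuming a hypothetical toy model and data; the constructor call itself uses the defaults documented above:

    import torch
    import torch.nn as nn

    # Hypothetical placeholders for illustration.
    model = nn.Linear(10, 1)
    inputs = torch.randn(4, 10)
    targets = torch.randn(4, 1)
    loss_fn = nn.MSELoss()

    # Construct the optimizer with the documented defaults.
    optimizer = torch.optim.Adadelta(model.parameters(), lr=1.0, rho=0.9,
                                     eps=1e-6, weight_decay=0)

    optimizer.zero_grad()                   # clear stale gradients
    loss = loss_fn(model(inputs), targets)  # forward pass
    loss.backward()                         # compute gradients
    optimizer.step()                        # apply one Adadelta update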
def torch.optim.adadelta.Adadelta.step (self, closure=None)
Performs a single optimization step.
Arguments:
closure (callable, optional): A closure that reevaluates the model
and returns the loss.
Definition at line 38 of file adadelta.py.
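A short sketch of the closure form, reusing the hypothetical model, data, loss_fn, and optimizer names from the sketch above; step() calls the closure to re-evaluate the loss and returns it:

    def closure():
        # Re-evaluate the model and return the loss, as step() expects.
        optimizer.zero_grad()
        loss = loss_fn(model(inputs), targets)
        loss.backward()
        return loss

    loss = optimizer.step(closure)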