@Operator(group="train") public final class ResourceApplyMomentum extends PrimitiveOp
Update '*var' according to the momentum scheme. Set `useNesterov(true)` if you want to use Nesterov momentum.
accum = accum * momentum + grad
var -= lr * accum
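To make the scheme concrete, here is a plain-Java sketch of a single update step for one scalar weight; the numbers are illustrative and nothing below touches the TensorFlow runtime.

```java
// Standalone illustration (not the TensorFlow API): one momentum step
// for a single scalar weight, following the scheme above.
public class MomentumDemo {
  public static void main(String[] args) {
    double momentum = 0.9;
    double lr = 0.1;    // learning rate ("scaling factor")
    double accum = 0.0; // momentum accumulator, initially zero
    double var = 1.0;   // the variable being trained
    double grad = 0.5;  // gradient for this step

    accum = accum * momentum + grad; // accum = 0.9 * 0.0 + 0.5 = 0.5
    var -= lr * accum;               // var   = 1.0 - 0.1 * 0.5 = 0.95

    System.out.println("accum = " + accum + ", var = " + var);
  }
}
```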
Modifier and Type | Class and Description |
---|---|
static class | ResourceApplyMomentum.Options: Optional attributes for the ResourceApplyMomentum operation |
Modifier and Type | Method and Description |
---|---|
static <T> ResourceApplyMomentum | create(Scope scope, Operand<?> var, Operand<?> accum, Operand<T> lr, Operand<T> grad, Operand<T> momentum, ResourceApplyMomentum.Options... options): Factory method to create a class wrapping a new ResourceApplyMomentum operation. |
static ResourceApplyMomentum.Options | useLocking(Boolean useLocking) |
static ResourceApplyMomentum.Options | useNesterov(Boolean useNesterov) |
Methods inherited from class org.tensorflow.op.PrimitiveOp: equals, hashCode, op, toString
public static <T> ResourceApplyMomentum create(Scope scope, Operand<?> var, Operand<?> accum, Operand<T> lr, Operand<T> grad, Operand<T> momentum, ResourceApplyMomentum.Options... options)
Parameters:
scope - current scope
var - Should be from a Variable().
accum - Should be from a Variable().
lr - Scaling factor. Must be a scalar.
grad - The gradient.
momentum - Momentum. Must be a scalar.
options - carries optional attributes values
Returns:
a new instance of ResourceApplyMomentum
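A minimal usage sketch, assuming the generated class lives in `org.tensorflow.op.train` (per the `group="train"` annotation) and that the resource-variable handles and tensor operands are built elsewhere in the graph; `MomentumStep` and its `step` method are hypothetical names introduced for illustration.

```java
import org.tensorflow.Operand;
import org.tensorflow.op.Scope;
import org.tensorflow.op.train.ResourceApplyMomentum;

public final class MomentumStep {
  private MomentumStep() {}

  /**
   * Hypothetical helper: adds one momentum update to the graph under
   * construction. Assumes the caller already built resource-variable
   * handles for {@code var} and {@code accum} and float operands for
   * the learning rate, gradient, and momentum.
   */
  public static ResourceApplyMomentum step(
      Scope scope,
      Operand<?> var,
      Operand<?> accum,
      Operand<Float> lr,
      Operand<Float> grad,
      Operand<Float> momentum) {
    return ResourceApplyMomentum.create(scope, var, accum, lr, grad, momentum);
  }
}
```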
public static ResourceApplyMomentum.Options useLocking(Boolean useLocking)

Parameters:
useLocking - If `True`, updating of the var and accum tensors will be protected by a lock; otherwise the behavior is undefined, but may exhibit less contention.
public static ResourceApplyMomentum.Options useNesterov(Boolean useNesterov)

Parameters:
useNesterov - If `True`, the tensor passed to compute grad will be var - lr * momentum * accum, so in the end, the var you get is actually var - lr * momentum * accum.
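Both option factories return `ResourceApplyMomentum.Options` values that can be passed together through the varargs parameter of `create`. A short fragment, reusing the hypothetical operands from the earlier sketch:

```java
// Sketch: enable locking and Nesterov momentum in one create call.
ResourceApplyMomentum op = ResourceApplyMomentum.create(
    scope, var, accum, lr, grad, momentum,
    ResourceApplyMomentum.useLocking(true),
    ResourceApplyMomentum.useNesterov(true));
```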