@Operator(group="train") public final class ResourceApplyProximalAdagrad extends PrimitiveOp
Update '*var' and '*accum' according to FOBOS with Adagrad learning rate:

accum += grad * grad
prox_v = var - lr * grad * (1 / sqrt(accum))
var = sign(prox_v) / (1 + lr * l2) * max{|prox_v| - lr * l1, 0}
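To make the update concrete, here is a minimal scalar walk-through in plain Java. The class name and all numeric values are hypothetical; the op itself applies this rule element-wise to whole tensors and mutates the resource variables in place.

```java
public class ProximalAdagradSketch {
  public static void main(String[] args) {
    double var = 1.0, accum = 0.1;         // parameter and its Adagrad accumulator
    double grad = 0.5;                     // incoming gradient
    double lr = 0.1, l1 = 0.01, l2 = 0.01; // scalar hyperparameters

    accum += grad * grad;                              // accum += grad * grad
    double proxV = var - lr * grad / Math.sqrt(accum); // plain Adagrad step
    var = Math.signum(proxV) / (1 + lr * l2)           // proximal shrinkage:
        * Math.max(Math.abs(proxV) - lr * l1, 0.0);    // soft-threshold by lr * l1

    System.out.printf("var = %.6f, accum = %.6f%n", var, accum);
  }
}
```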
Modifier and Type | Class and Description |
---|---|
static class | ResourceApplyProximalAdagrad.Options Optional attributes for ResourceApplyProximalAdagrad |
Modifier and Type | Method and Description |
---|---|
static <T> ResourceApplyProximalAdagrad | create(Scope scope, Operand<?> var, Operand<?> accum, Operand<T> lr, Operand<T> l1, Operand<T> l2, Operand<T> grad, ResourceApplyProximalAdagrad.Options... options) Factory method to create a class wrapping a new ResourceApplyProximalAdagrad operation. |
static ResourceApplyProximalAdagrad.Options | useLocking(Boolean useLocking) |
Methods inherited from class PrimitiveOp: equals, hashCode, op, toString
public static <T> ResourceApplyProximalAdagrad create(Scope scope, Operand<?> var, Operand<?> accum, Operand<T> lr, Operand<T> l1, Operand<T> l2, Operand<T> grad, ResourceApplyProximalAdagrad.Options... options)
Parameters:

scope - current scope
var - Should be from a Variable().
accum - Should be from a Variable().
lr - Scaling factor. Must be a scalar.
l1 - L1 regularization. Must be a scalar.
l2 - L2 regularization. Must be a scalar.
grad - The gradient.
options - carries optional attribute values
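As a rough usage illustration, the helper below wires the op into an existing graph. This is a hedged sketch: attachUpdate, varHandle, accumHandle, gradient, and the hyperparameter values are illustrative names and numbers, not part of this API; var and accum must be resource-variable handles built elsewhere (e.g. from a VarHandleOp).

```java
import org.tensorflow.Operand;
import org.tensorflow.op.Scope;
import org.tensorflow.op.core.Constant;

// Sketch: attach a proximal-Adagrad update to a graph via the factory method.
static ResourceApplyProximalAdagrad attachUpdate(
    Scope scope,
    Operand<?> varHandle,      // resource handle of the variable to update
    Operand<?> accumHandle,    // resource handle of the Adagrad accumulator
    Operand<Float> gradient) { // gradient w.r.t. the variable
  Constant<Float> lr = Constant.create(scope, 0.01f);  // scalar learning rate
  Constant<Float> l1 = Constant.create(scope, 0.001f); // scalar L1 strength
  Constant<Float> l2 = Constant.create(scope, 0.001f); // scalar L2 strength
  return ResourceApplyProximalAdagrad.create(
      scope, varHandle, accumHandle, lr, l1, l2, gradient,
      ResourceApplyProximalAdagrad.useLocking(true));  // guard concurrent updates
}
```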
public static ResourceApplyProximalAdagrad.Options useLocking(Boolean useLocking)
Parameters:

useLocking - If True, updating of the var and accum tensors will be protected by a lock; otherwise the behavior is undefined, but may exhibit less contention.
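The op produces no output tensors, so executing it means adding it as a session target via the inherited op() accessor. A hedged sketch continuing the attachUpdate example above, assuming g is the Graph the op was added to and update is the instance returned by attachUpdate(...):

```java
import org.tensorflow.Session;

// `g` and `update` come from the earlier sketch.
try (Session session = new Session(g)) {
  session.runner()
      .addTarget(update.op()) // no fetches: run purely for the in-place update
      .run();
}
```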