Package | Description |
---|---|
org.bytedeco.tensorflow | |
Modifier and Type | Method and Description |
---|---|
ResourceApplyAdagradDA.Attrs | ResourceApplyAdagradDA.Attrs.getPointer(long i) |
ResourceApplyAdagradDA.Attrs | ResourceApplyAdagradDA.Attrs.position(long position) |
ResourceApplyAdagradDA.Attrs | ResourceApplyAdagradDA.Attrs.use_locking_(boolean setter) |
static ResourceApplyAdagradDA.Attrs | ResourceApplyAdagradDA.UseLocking(boolean x) If True, updating of the var and accum tensors will be protected by a lock; otherwise the behavior is undefined, but may exhibit less contention. |
ResourceApplyAdagradDA.Attrs | ResourceApplyAdagradDA.Attrs.UseLocking(boolean x) If True, updating of the var and accum tensors will be protected by a lock; otherwise the behavior is undefined, but may exhibit less contention. |
Constructor and Description |
---|
ResourceApplyAdagradDA(Scope scope, Input var, Input gradient_accumulator, Input gradient_squared_accumulator, Input grad, Input lr, Input l1, Input l2, Input global_step, ResourceApplyAdagradDA.Attrs attrs) |
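Taken together, the entries above suggest the usual pattern of building an Attrs value with the static UseLocking helper and passing it to the constructor. The following is a non-compilable sketch only: it assumes the listed org.bytedeco.tensorflow classes are on the classpath, and the scope and Input arguments (var, grad, and the rest) are hypothetical pre-built handles whose construction is not shown on this page.

```java
import org.bytedeco.tensorflow.*;

// Sketch: scope, var, gradient_accumulator, etc. are assumed to already
// exist as Scope/Input handles; this page documents only the signatures below.
ResourceApplyAdagradDA.Attrs attrs =
    ResourceApplyAdagradDA.UseLocking(true);  // static helper returning an Attrs

ResourceApplyAdagradDA op = new ResourceApplyAdagradDA(
    scope,
    var,                            // variable to update
    gradient_accumulator,
    gradient_squared_accumulator,
    grad,
    lr,                             // learning rate
    l1, l2,                         // regularization terms
    global_step,
    attrs);                         // updates protected by a lock
```

Per the UseLocking description above, passing true trades some contention for a defined update order; with false the behavior is undefined but may exhibit less contention.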
Copyright © 2022. All rights reserved.