Class | Description |
---|---|
AccumulatorApplyGradient | Applies a gradient to a given accumulator. |
AccumulatorNumAccumulated | Returns the number of gradients aggregated in the given accumulators. |
AccumulatorSetGlobalStep | Updates the accumulator with a new value for global_step. |
AccumulatorTakeGradient<T> | Extracts the average gradient in the given ConditionalAccumulator. |
ApplyAdadelta<T> | Update '*var' according to the adadelta scheme. |
ApplyAdadelta.Options | Optional attributes for ApplyAdadelta |
ApplyAdagrad<T> | Update '*var' according to the adagrad scheme. |
ApplyAdagrad.Options | Optional attributes for ApplyAdagrad |
ApplyAdagradDa<T> | Update '*var' according to the proximal adagrad scheme. |
ApplyAdagradDa.Options | Optional attributes for ApplyAdagradDa |
ApplyAdam<T> | Update '*var' according to the Adam algorithm. |
ApplyAdam.Options | Optional attributes for ApplyAdam |
ApplyAdaMax<T> | Update '*var' according to the AdaMax algorithm. |
ApplyAdaMax.Options | Optional attributes for ApplyAdaMax |
ApplyAddSign<T> | Update '*var' according to the AddSign update. |
ApplyAddSign.Options | Optional attributes for ApplyAddSign |
ApplyCenteredRmsProp<T> | Update '*var' according to the centered RMSProp algorithm. |
ApplyCenteredRmsProp.Options | Optional attributes for ApplyCenteredRmsProp |
ApplyFtrl<T> | Update '*var' according to the Ftrl-proximal scheme. |
ApplyFtrl.Options | Optional attributes for ApplyFtrl |
ApplyGradientDescent<T> | Update '*var' by subtracting 'alpha' * 'delta' from it. |
ApplyGradientDescent.Options | Optional attributes for ApplyGradientDescent |
ApplyMomentum<T> | Update '*var' according to the momentum scheme. |
ApplyMomentum.Options | Optional attributes for ApplyMomentum |
ApplyPowerSign<T> | Update '*var' according to the PowerSign update. |
ApplyPowerSign.Options | Optional attributes for ApplyPowerSign |
ApplyProximalAdagrad<T> | Update '*var' and '*accum' according to FOBOS with Adagrad learning rate. |
ApplyProximalAdagrad.Options | Optional attributes for ApplyProximalAdagrad |
ApplyProximalGradientDescent<T> | Update '*var' as FOBOS algorithm with fixed learning rate. |
ApplyProximalGradientDescent.Options | Optional attributes for ApplyProximalGradientDescent |
ApplyRmsProp<T> | Update '*var' according to the RMSProp algorithm. |
ApplyRmsProp.Options | Optional attributes for ApplyRmsProp |
ConditionalAccumulator | A conditional accumulator for aggregating gradients. |
ConditionalAccumulator.Options | Optional attributes for ConditionalAccumulator |
GenerateVocabRemapping | Given a path to new and old vocabulary files, returns a remapping Tensor of length num_new_vocab. |
GenerateVocabRemapping.Options | Optional attributes for GenerateVocabRemapping |
MergeV2Checkpoints | V2 format specific: merges the metadata files of sharded checkpoints. |
MergeV2Checkpoints.Options | Optional attributes for MergeV2Checkpoints |
NegTrain | Training via negative sampling. |
PreventGradient<T> | An identity op that triggers an error if a gradient is requested. |
PreventGradient.Options | Optional attributes for PreventGradient |
ResourceApplyAdadelta | Update '*var' according to the adadelta scheme. |
ResourceApplyAdadelta.Options | Optional attributes for ResourceApplyAdadelta |
ResourceApplyAdagrad | Update '*var' according to the adagrad scheme. |
ResourceApplyAdagrad.Options | Optional attributes for ResourceApplyAdagrad |
ResourceApplyAdagradDa | Update '*var' according to the proximal adagrad scheme. |
ResourceApplyAdagradDa.Options | Optional attributes for ResourceApplyAdagradDa |
ResourceApplyAdam | Update '*var' according to the Adam algorithm. |
ResourceApplyAdam.Options | Optional attributes for ResourceApplyAdam |
ResourceApplyAdaMax | Update '*var' according to the AdaMax algorithm. |
ResourceApplyAdaMax.Options | Optional attributes for ResourceApplyAdaMax |
ResourceApplyAddSign | Update '*var' according to the AddSign update. |
ResourceApplyAddSign.Options | Optional attributes for ResourceApplyAddSign |
ResourceApplyCenteredRmsProp | Update '*var' according to the centered RMSProp algorithm. |
ResourceApplyCenteredRmsProp.Options | Optional attributes for ResourceApplyCenteredRmsProp |
ResourceApplyFtrl | Update '*var' according to the Ftrl-proximal scheme. |
ResourceApplyFtrl.Options | Optional attributes for ResourceApplyFtrl |
ResourceApplyGradientDescent | Update '*var' by subtracting 'alpha' * 'delta' from it. |
ResourceApplyGradientDescent.Options | Optional attributes for ResourceApplyGradientDescent |
ResourceApplyMomentum | Update '*var' according to the momentum scheme. |
ResourceApplyMomentum.Options | Optional attributes for ResourceApplyMomentum |
ResourceApplyPowerSign | Update '*var' according to the PowerSign update. |
ResourceApplyPowerSign.Options | Optional attributes for ResourceApplyPowerSign |
ResourceApplyProximalAdagrad | Update '*var' and '*accum' according to FOBOS with Adagrad learning rate. |
ResourceApplyProximalAdagrad.Options | Optional attributes for ResourceApplyProximalAdagrad |
ResourceApplyProximalGradientDescent | Update '*var' as FOBOS algorithm with fixed learning rate. |
ResourceApplyProximalGradientDescent.Options | Optional attributes for ResourceApplyProximalGradientDescent |
ResourceApplyRmsProp | Update '*var' according to the RMSProp algorithm. |
ResourceApplyRmsProp.Options | Optional attributes for ResourceApplyRmsProp |
ResourceSparseApplyAdadelta | Update relevant entries in '*var' and '*accum' according to the adadelta scheme. |
ResourceSparseApplyAdadelta.Options | Optional attributes for ResourceSparseApplyAdadelta |
ResourceSparseApplyAdagrad | Update relevant entries in '*var' and '*accum' according to the adagrad scheme. |
ResourceSparseApplyAdagrad.Options | Optional attributes for ResourceSparseApplyAdagrad |
ResourceSparseApplyAdagradDa | Update entries in '*var' and '*accum' according to the proximal adagrad scheme. |
ResourceSparseApplyAdagradDa.Options | Optional attributes for ResourceSparseApplyAdagradDa |
ResourceSparseApplyCenteredRmsProp | Update '*var' according to the centered RMSProp algorithm. |
ResourceSparseApplyCenteredRmsProp.Options | Optional attributes for ResourceSparseApplyCenteredRmsProp |
ResourceSparseApplyFtrl | Update relevant entries in '*var' according to the Ftrl-proximal scheme. |
ResourceSparseApplyFtrl.Options | Optional attributes for ResourceSparseApplyFtrl |
ResourceSparseApplyMomentum | Update relevant entries in '*var' and '*accum' according to the momentum scheme. |
ResourceSparseApplyMomentum.Options | Optional attributes for ResourceSparseApplyMomentum |
ResourceSparseApplyProximalAdagrad | Sparse update entries in '*var' and '*accum' according to FOBOS algorithm. |
ResourceSparseApplyProximalAdagrad.Options | Optional attributes for ResourceSparseApplyProximalAdagrad |
ResourceSparseApplyProximalGradientDescent | Sparse update '*var' as FOBOS algorithm with fixed learning rate. |
ResourceSparseApplyProximalGradientDescent.Options | Optional attributes for ResourceSparseApplyProximalGradientDescent |
ResourceSparseApplyRmsProp | Update '*var' according to the RMSProp algorithm. |
ResourceSparseApplyRmsProp.Options | Optional attributes for ResourceSparseApplyRmsProp |
Restore | Restores tensors from a V2 checkpoint. |
RestoreSlice<T> | Restores a tensor from checkpoint files. |
RestoreSlice.Options | Optional attributes for RestoreSlice |
Save | Saves tensors in V2 checkpoint format. |
SaveSlices | Saves input tensor slices to disk. |
SdcaFprint | Computes fingerprints of the input strings. |
SdcaOptimizer | Distributed version of Stochastic Dual Coordinate Ascent (SDCA) optimizer for linear models with L1 + L2 regularization. |
SdcaOptimizer.Options | Optional attributes for SdcaOptimizer |
SdcaShrinkL1 | Applies L1 regularization shrink step on the parameters. |
SparseApplyAdadelta<T> | Update relevant entries in '*var' and '*accum' according to the adadelta scheme. |
SparseApplyAdadelta.Options | Optional attributes for SparseApplyAdadelta |
SparseApplyAdagrad<T> | Update relevant entries in '*var' and '*accum' according to the adagrad scheme. |
SparseApplyAdagrad.Options | Optional attributes for SparseApplyAdagrad |
SparseApplyAdagradDa<T> | Update entries in '*var' and '*accum' according to the proximal adagrad scheme. |
SparseApplyAdagradDa.Options | Optional attributes for SparseApplyAdagradDa |
SparseApplyCenteredRmsProp<T> | Update '*var' according to the centered RMSProp algorithm. |
SparseApplyCenteredRmsProp.Options | Optional attributes for SparseApplyCenteredRmsProp |
SparseApplyFtrl<T> | Update relevant entries in '*var' according to the Ftrl-proximal scheme. |
SparseApplyFtrl.Options | Optional attributes for SparseApplyFtrl |
SparseApplyMomentum<T> | Update relevant entries in '*var' and '*accum' according to the momentum scheme. |
SparseApplyMomentum.Options | Optional attributes for SparseApplyMomentum |
SparseApplyProximalAdagrad<T> | Sparse update entries in '*var' and '*accum' according to FOBOS algorithm. |
SparseApplyProximalAdagrad.Options | Optional attributes for SparseApplyProximalAdagrad |
SparseApplyProximalGradientDescent<T> | Sparse update '*var' as FOBOS algorithm with fixed learning rate. |
SparseApplyProximalGradientDescent.Options | Optional attributes for SparseApplyProximalGradientDescent |
SparseApplyRmsProp<T> | Update '*var' according to the RMSProp algorithm. |
SparseApplyRmsProp.Options | Optional attributes for SparseApplyRmsProp |
TileGrad<T> | Returns the gradient of `Tile`. |
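The ApplyGradientDescent and ApplyMomentum entries above describe the two simplest dense update rules. A minimal Python sketch of those rules as commonly stated, not TensorFlow's implementation (function and parameter names mirror the op inputs but are illustrative):

```python
def apply_gradient_descent(var, alpha, delta):
    # var -= alpha * delta, element-wise
    return [v - alpha * d for v, d in zip(var, delta)]

def apply_momentum(var, accum, lr, grad, momentum, use_nesterov=False):
    # accum = accum * momentum + grad
    new_accum = [a * momentum + g for a, g in zip(accum, grad)]
    if use_nesterov:
        # var -= grad * lr + accum * momentum * lr
        new_var = [v - (g * lr + a * momentum * lr)
                   for v, g, a in zip(var, grad, new_accum)]
    else:
        # var -= lr * accum
        new_var = [v - lr * a for v, a in zip(var, new_accum)]
    return new_var, new_accum
```

The `use_nesterov` flag corresponds to the optional attribute carried by the `.Options` class for the momentum op.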
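The adagrad ops listed above maintain a per-element accumulator of squared gradients that scales down the step size over time. A sketch of the basic rule, assuming no optional attributes (illustrative, not the TensorFlow kernel):

```python
import math

def apply_adagrad(var, accum, lr, grad):
    # accum += grad^2
    new_accum = [a + g * g for a, g in zip(accum, grad)]
    # var -= lr * grad / sqrt(accum)
    new_var = [v - lr * g / math.sqrt(a)
               for v, g, a in zip(var, grad, new_accum)]
    return new_var, new_accum
```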
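The Adam ops keep two moving averages, of the gradient (`m`) and of its square (`v`), with bias correction via the `beta1_power` and `beta2_power` inputs. A sketch of the commonly stated rule (parameter names follow the op inputs; this is not the TensorFlow kernel):

```python
import math

def apply_adam(var, m, v, beta1_power, beta2_power,
               lr, beta1, beta2, epsilon, grad):
    # bias-corrected step size: lr * sqrt(1 - beta2^t) / (1 - beta1^t)
    lr_t = lr * math.sqrt(1.0 - beta2_power) / (1.0 - beta1_power)
    # first and second moment estimates
    m = [beta1 * mi + (1.0 - beta1) * g for mi, g in zip(m, grad)]
    v = [beta2 * vi + (1.0 - beta2) * g * g for vi, g in zip(v, grad)]
    # var -= lr_t * m / (sqrt(v) + epsilon)
    var = [x - lr_t * mi / (math.sqrt(vi) + epsilon)
           for x, mi, vi in zip(var, m, v)]
    return var, m, v
```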
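The RMSProp ops track a decaying mean of squared gradients (`ms`) and a momentum buffer (`mom`). A sketch of the plain (non-centered) rule as commonly stated, with names mirroring the op inputs (illustrative only):

```python
import math

def apply_rms_prop(var, ms, mom, lr, rho, momentum, epsilon, grad):
    # ms  = rho * ms + (1 - rho) * grad^2
    ms = [rho * s + (1.0 - rho) * g * g for s, g in zip(ms, grad)]
    # mom = momentum * mom + lr * grad / sqrt(ms + epsilon)
    mom = [momentum * m + lr * g / math.sqrt(s + epsilon)
           for m, g, s in zip(mom, grad, ms)]
    # var -= mom
    var = [v - m for v, m in zip(var, mom)]
    return var, ms, mom
```

The centered variants additionally track a decaying mean of the (uncentered) gradient and subtract its square from `ms` before the square root.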
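The various Sparse/ResourceSparse ops differ from their dense counterparts in that the gradient comes with an `indices` input, and only the addressed rows of the variable are touched. A minimal sketch of that pattern for gradient descent (illustrative; the real ops take the same indexed form for their respective schemes):

```python
def sparse_apply_gradient_descent(var, alpha, grad, indices):
    # Only the rows named in `indices` are updated;
    # all other entries of var are left untouched.
    for g, i in zip(grad, indices):
        var[i] = var[i] - alpha * g
    return var
```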