Packages that use org.tensorflow.op.train

Package | Description
---|---
org.tensorflow.op |
org.tensorflow.op.train |
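
Both packages sit behind the generated `Ops` entry point in org.tensorflow.op: the classes catalogued below are built through its `train` group rather than instantiated directly. A minimal sketch, assuming the TF 1.x Java client, where `Ops.create(Graph)` and the generated `tf.train` group accessor are available (both are assumptions that may vary by release):

```java
import org.tensorflow.Graph;
import org.tensorflow.op.Ops;

public class TrainGroupEntryPoint {
  public static void main(String[] args) {
    try (Graph g = new Graph()) {
      // Ops lives in org.tensorflow.op; its train group builds the
      // org.tensorflow.op.train classes listed in the tables below.
      Ops tf = Ops.create(g);
      System.out.println(tf.train.getClass().getName());
    }
  }
}
```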
Classes in org.tensorflow.op.train used by org.tensorflow.op

Class | Description
---|---
AccumulatorApplyGradient | Applies a gradient to a given accumulator.
AccumulatorNumAccumulated | Returns the number of gradients aggregated in the given accumulators.
AccumulatorSetGlobalStep | Updates the accumulator with a new value for global_step.
AccumulatorTakeGradient | Extracts the average gradient in the given `ConditionalAccumulator`.
ApplyAdadelta | Update '*var' according to the adadelta scheme.
ApplyAdadelta.Options | Optional attributes for `ApplyAdadelta`.
ApplyAdagrad | Update '*var' according to the adagrad scheme.
ApplyAdagrad.Options | Optional attributes for `ApplyAdagrad`.
ApplyAdagradDa | Update '*var' according to the proximal adagrad scheme.
ApplyAdagradDa.Options | Optional attributes for `ApplyAdagradDa`.
ApplyAdam | Update '*var' according to the Adam algorithm.
ApplyAdam.Options | Optional attributes for `ApplyAdam`.
ApplyAddSign | Update '*var' according to the AddSign update.
ApplyAddSign.Options | Optional attributes for `ApplyAddSign`.
ApplyCenteredRmsProp | Update '*var' according to the centered RMSProp algorithm.
ApplyCenteredRmsProp.Options | Optional attributes for `ApplyCenteredRmsProp`.
ApplyFtrl | Update '*var' according to the Ftrl-proximal scheme.
ApplyFtrl.Options | Optional attributes for `ApplyFtrl`.
ApplyGradientDescent | Update '*var' by subtracting 'alpha' * 'delta' from it.
ApplyGradientDescent.Options | Optional attributes for `ApplyGradientDescent`.
ApplyMomentum | Update '*var' according to the momentum scheme.
ApplyMomentum.Options | Optional attributes for `ApplyMomentum`.
ApplyPowerSign | Update '*var' according to the PowerSign update.
ApplyPowerSign.Options | Optional attributes for `ApplyPowerSign`.
ApplyProximalAdagrad | Update '*var' and '*accum' according to FOBOS with the Adagrad learning rate.
ApplyProximalAdagrad.Options | Optional attributes for `ApplyProximalAdagrad`.
ApplyProximalGradientDescent | Update '*var' using the FOBOS algorithm with a fixed learning rate.
ApplyProximalGradientDescent.Options | Optional attributes for `ApplyProximalGradientDescent`.
ApplyRmsProp | Update '*var' according to the RMSProp algorithm.
ApplyRmsProp.Options | Optional attributes for `ApplyRmsProp`.
ConditionalAccumulator | A conditional accumulator for aggregating gradients.
ConditionalAccumulator.Options | Optional attributes for `ConditionalAccumulator`.
GenerateVocabRemapping | Given a path to new and old vocabulary files, returns a remapping Tensor of length `num_new_vocab`.
GenerateVocabRemapping.Options | Optional attributes for `GenerateVocabRemapping`.
MergeV2Checkpoints | V2 format specific: merges the metadata files of sharded checkpoints.
MergeV2Checkpoints.Options | Optional attributes for `MergeV2Checkpoints`.
NegTrain | Training via negative sampling.
PreventGradient | An identity op that triggers an error if a gradient is requested.
PreventGradient.Options | Optional attributes for `PreventGradient`.
ResourceApplyAdadelta | Update '*var' according to the adadelta scheme.
ResourceApplyAdadelta.Options | Optional attributes for `ResourceApplyAdadelta`.
ResourceApplyAdagrad | Update '*var' according to the adagrad scheme.
ResourceApplyAdagrad.Options | Optional attributes for `ResourceApplyAdagrad`.
ResourceApplyAdagradDa | Update '*var' according to the proximal adagrad scheme.
ResourceApplyAdagradDa.Options | Optional attributes for `ResourceApplyAdagradDa`.
ResourceApplyAdam | Update '*var' according to the Adam algorithm.
ResourceApplyAdam.Options | Optional attributes for `ResourceApplyAdam`.
ResourceApplyAddSign | Update '*var' according to the AddSign update.
ResourceApplyAddSign.Options | Optional attributes for `ResourceApplyAddSign`.
ResourceApplyCenteredRmsProp | Update '*var' according to the centered RMSProp algorithm.
ResourceApplyCenteredRmsProp.Options | Optional attributes for `ResourceApplyCenteredRmsProp`.
ResourceApplyFtrl | Update '*var' according to the Ftrl-proximal scheme.
ResourceApplyFtrl.Options | Optional attributes for `ResourceApplyFtrl`.
ResourceApplyGradientDescent | Update '*var' by subtracting 'alpha' * 'delta' from it.
ResourceApplyGradientDescent.Options | Optional attributes for `ResourceApplyGradientDescent`.
ResourceApplyMomentum | Update '*var' according to the momentum scheme.
ResourceApplyMomentum.Options | Optional attributes for `ResourceApplyMomentum`.
ResourceApplyPowerSign | Update '*var' according to the PowerSign update.
ResourceApplyPowerSign.Options | Optional attributes for `ResourceApplyPowerSign`.
ResourceApplyProximalAdagrad | Update '*var' and '*accum' according to FOBOS with the Adagrad learning rate.
ResourceApplyProximalAdagrad.Options | Optional attributes for `ResourceApplyProximalAdagrad`.
ResourceApplyProximalGradientDescent | Update '*var' using the FOBOS algorithm with a fixed learning rate.
ResourceApplyProximalGradientDescent.Options | Optional attributes for `ResourceApplyProximalGradientDescent`.
ResourceApplyRmsProp | Update '*var' according to the RMSProp algorithm.
ResourceApplyRmsProp.Options | Optional attributes for `ResourceApplyRmsProp`.
ResourceSparseApplyAdadelta | Update relevant entries in '*var' according to the adadelta scheme.
ResourceSparseApplyAdadelta.Options | Optional attributes for `ResourceSparseApplyAdadelta`.
ResourceSparseApplyAdagrad | Update relevant entries in '*var' and '*accum' according to the adagrad scheme.
ResourceSparseApplyAdagrad.Options | Optional attributes for `ResourceSparseApplyAdagrad`.
ResourceSparseApplyAdagradDa | Update entries in '*var' and '*accum' according to the proximal adagrad scheme.
ResourceSparseApplyAdagradDa.Options | Optional attributes for `ResourceSparseApplyAdagradDa`.
ResourceSparseApplyCenteredRmsProp | Update '*var' according to the centered RMSProp algorithm.
ResourceSparseApplyCenteredRmsProp.Options | Optional attributes for `ResourceSparseApplyCenteredRmsProp`.
ResourceSparseApplyFtrl | Update relevant entries in '*var' according to the Ftrl-proximal scheme.
ResourceSparseApplyFtrl.Options | Optional attributes for `ResourceSparseApplyFtrl`.
ResourceSparseApplyMomentum | Update relevant entries in '*var' and '*accum' according to the momentum scheme.
ResourceSparseApplyMomentum.Options | Optional attributes for `ResourceSparseApplyMomentum`.
ResourceSparseApplyProximalAdagrad | Sparse update of entries in '*var' and '*accum' according to the FOBOS algorithm.
ResourceSparseApplyProximalAdagrad.Options | Optional attributes for `ResourceSparseApplyProximalAdagrad`.
ResourceSparseApplyProximalGradientDescent | Sparse update of '*var' using the FOBOS algorithm with a fixed learning rate.
ResourceSparseApplyProximalGradientDescent.Options | Optional attributes for `ResourceSparseApplyProximalGradientDescent`.
ResourceSparseApplyRmsProp | Update '*var' according to the RMSProp algorithm.
ResourceSparseApplyRmsProp.Options | Optional attributes for `ResourceSparseApplyRmsProp`.
Restore | Restores tensors from a V2 checkpoint.
RestoreSlice | Restores a tensor from checkpoint files.
RestoreSlice.Options | Optional attributes for `RestoreSlice`.
Save | Saves tensors in V2 checkpoint format.
SaveSlices | Saves input tensor slices to disk.
SdcaFprint | Computes fingerprints of the input strings.
SdcaShrinkL1 | Applies an L1 regularization shrink step to the parameters.
SparseApplyAdadelta | Update relevant entries in '*var' according to the adadelta scheme.
SparseApplyAdadelta.Options | Optional attributes for `SparseApplyAdadelta`.
SparseApplyAdagrad | Update relevant entries in '*var' and '*accum' according to the adagrad scheme.
SparseApplyAdagrad.Options | Optional attributes for `SparseApplyAdagrad`.
SparseApplyAdagradDa | Update entries in '*var' and '*accum' according to the proximal adagrad scheme.
SparseApplyAdagradDa.Options | Optional attributes for `SparseApplyAdagradDa`.
SparseApplyCenteredRmsProp | Update '*var' according to the centered RMSProp algorithm.
SparseApplyCenteredRmsProp.Options | Optional attributes for `SparseApplyCenteredRmsProp`.
SparseApplyFtrl | Update relevant entries in '*var' according to the Ftrl-proximal scheme.
SparseApplyFtrl.Options | Optional attributes for `SparseApplyFtrl`.
SparseApplyMomentum | Update relevant entries in '*var' and '*accum' according to the momentum scheme.
SparseApplyMomentum.Options | Optional attributes for `SparseApplyMomentum`.
SparseApplyProximalAdagrad | Sparse update of entries in '*var' and '*accum' according to the FOBOS algorithm.
SparseApplyProximalAdagrad.Options | Optional attributes for `SparseApplyProximalAdagrad`.
SparseApplyProximalGradientDescent | Sparse update of '*var' using the FOBOS algorithm with a fixed learning rate.
SparseApplyProximalGradientDescent.Options | Optional attributes for `SparseApplyProximalGradientDescent`.
SparseApplyRmsProp | Update '*var' according to the RMSProp algorithm.
SparseApplyRmsProp.Options | Optional attributes for `SparseApplyRmsProp`.
TileGrad | Returns the gradient of `Tile`.
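
Each `Apply*` class above builds a single in-graph update step; `ApplyGradientDescent`, for instance, computes var <- var - alpha * delta. A minimal sketch of wiring and running one such step through the 1.x Java client; the builder names used here (`tf.variable`, `tf.assign`, `tf.train.applyGradientDescent`) follow the generated camelCase convention and should be treated as assumptions that may differ across releases:

```java
import org.tensorflow.Graph;
import org.tensorflow.Session;
import org.tensorflow.Shape;
import org.tensorflow.Tensor;
import org.tensorflow.op.Ops;
import org.tensorflow.op.core.Assign;
import org.tensorflow.op.core.Variable;
import org.tensorflow.op.train.ApplyGradientDescent;

public class GradientDescentStep {
  public static void main(String[] args) {
    try (Graph g = new Graph()) {
      Ops tf = Ops.create(g);

      // A scalar float variable plus an op that initializes it to 10.
      Variable<Float> var = tf.variable(Shape.scalar(), Float.class);
      Assign<Float> init = tf.assign(var, tf.constant(10f));

      // One update step: var <- var - alpha * delta = 10 - 0.5 * 2.
      ApplyGradientDescent<Float> step =
          tf.train.applyGradientDescent(var, tf.constant(0.5f), tf.constant(2f));

      try (Session s = new Session(g)) {
        s.runner().fetch(init.asOutput()).run();  // run the initializer first
        try (Tensor<?> t = s.runner().fetch(step.asOutput()).run().get(0)) {
          System.out.println(t.floatValue());     // expected: 9.0
        }
      }
    }
  }
}
```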
Classes in org.tensorflow.op.train used by org.tensorflow.op.train

Class | Description
---|---
AccumulatorApplyGradient | Applies a gradient to a given accumulator.
AccumulatorNumAccumulated | Returns the number of gradients aggregated in the given accumulators.
AccumulatorSetGlobalStep | Updates the accumulator with a new value for global_step.
AccumulatorTakeGradient | Extracts the average gradient in the given `ConditionalAccumulator`.
ApplyAdadelta | Update '*var' according to the adadelta scheme.
ApplyAdadelta.Options | Optional attributes for `ApplyAdadelta`.
ApplyAdagrad | Update '*var' according to the adagrad scheme.
ApplyAdagrad.Options | Optional attributes for `ApplyAdagrad`.
ApplyAdagradDa | Update '*var' according to the proximal adagrad scheme.
ApplyAdagradDa.Options | Optional attributes for `ApplyAdagradDa`.
ApplyAdam | Update '*var' according to the Adam algorithm.
ApplyAdam.Options | Optional attributes for `ApplyAdam`.
ApplyAdaMax | Update '*var' according to the AdaMax algorithm.
ApplyAdaMax.Options | Optional attributes for `ApplyAdaMax`.
ApplyAddSign | Update '*var' according to the AddSign update.
ApplyAddSign.Options | Optional attributes for `ApplyAddSign`.
ApplyCenteredRmsProp | Update '*var' according to the centered RMSProp algorithm.
ApplyCenteredRmsProp.Options | Optional attributes for `ApplyCenteredRmsProp`.
ApplyFtrl | Update '*var' according to the Ftrl-proximal scheme.
ApplyFtrl.Options | Optional attributes for `ApplyFtrl`.
ApplyGradientDescent | Update '*var' by subtracting 'alpha' * 'delta' from it.
ApplyGradientDescent.Options | Optional attributes for `ApplyGradientDescent`.
ApplyMomentum | Update '*var' according to the momentum scheme.
ApplyMomentum.Options | Optional attributes for `ApplyMomentum`.
ApplyPowerSign | Update '*var' according to the PowerSign update.
ApplyPowerSign.Options | Optional attributes for `ApplyPowerSign`.
ApplyProximalAdagrad | Update '*var' and '*accum' according to FOBOS with the Adagrad learning rate.
ApplyProximalAdagrad.Options | Optional attributes for `ApplyProximalAdagrad`.
ApplyProximalGradientDescent | Update '*var' using the FOBOS algorithm with a fixed learning rate.
ApplyProximalGradientDescent.Options | Optional attributes for `ApplyProximalGradientDescent`.
ApplyRmsProp | Update '*var' according to the RMSProp algorithm.
ApplyRmsProp.Options | Optional attributes for `ApplyRmsProp`.
ConditionalAccumulator | A conditional accumulator for aggregating gradients.
ConditionalAccumulator.Options | Optional attributes for `ConditionalAccumulator`.
GenerateVocabRemapping | Given a path to new and old vocabulary files, returns a remapping Tensor of length `num_new_vocab`.
GenerateVocabRemapping.Options | Optional attributes for `GenerateVocabRemapping`.
MergeV2Checkpoints | V2 format specific: merges the metadata files of sharded checkpoints.
MergeV2Checkpoints.Options | Optional attributes for `MergeV2Checkpoints`.
NegTrain | Training via negative sampling.
PreventGradient | An identity op that triggers an error if a gradient is requested.
PreventGradient.Options | Optional attributes for `PreventGradient`.
ResourceApplyAdadelta | Update '*var' according to the adadelta scheme.
ResourceApplyAdadelta.Options | Optional attributes for `ResourceApplyAdadelta`.
ResourceApplyAdagrad | Update '*var' according to the adagrad scheme.
ResourceApplyAdagrad.Options | Optional attributes for `ResourceApplyAdagrad`.
ResourceApplyAdagradDa | Update '*var' according to the proximal adagrad scheme.
ResourceApplyAdagradDa.Options | Optional attributes for `ResourceApplyAdagradDa`.
ResourceApplyAdam | Update '*var' according to the Adam algorithm.
ResourceApplyAdam.Options | Optional attributes for `ResourceApplyAdam`.
ResourceApplyAdaMax | Update '*var' according to the AdaMax algorithm.
ResourceApplyAdaMax.Options | Optional attributes for `ResourceApplyAdaMax`.
ResourceApplyAddSign | Update '*var' according to the AddSign update.
ResourceApplyAddSign.Options | Optional attributes for `ResourceApplyAddSign`.
ResourceApplyCenteredRmsProp | Update '*var' according to the centered RMSProp algorithm.
ResourceApplyCenteredRmsProp.Options | Optional attributes for `ResourceApplyCenteredRmsProp`.
ResourceApplyFtrl | Update '*var' according to the Ftrl-proximal scheme.
ResourceApplyFtrl.Options | Optional attributes for `ResourceApplyFtrl`.
ResourceApplyGradientDescent | Update '*var' by subtracting 'alpha' * 'delta' from it.
ResourceApplyGradientDescent.Options | Optional attributes for `ResourceApplyGradientDescent`.
ResourceApplyMomentum | Update '*var' according to the momentum scheme.
ResourceApplyMomentum.Options | Optional attributes for `ResourceApplyMomentum`.
ResourceApplyPowerSign | Update '*var' according to the PowerSign update.
ResourceApplyPowerSign.Options | Optional attributes for `ResourceApplyPowerSign`.
ResourceApplyProximalAdagrad | Update '*var' and '*accum' according to FOBOS with the Adagrad learning rate.
ResourceApplyProximalAdagrad.Options | Optional attributes for `ResourceApplyProximalAdagrad`.
ResourceApplyProximalGradientDescent | Update '*var' using the FOBOS algorithm with a fixed learning rate.
ResourceApplyProximalGradientDescent.Options | Optional attributes for `ResourceApplyProximalGradientDescent`.
ResourceApplyRmsProp | Update '*var' according to the RMSProp algorithm.
ResourceApplyRmsProp.Options | Optional attributes for `ResourceApplyRmsProp`.
ResourceSparseApplyAdadelta | Update relevant entries in '*var' according to the adadelta scheme.
ResourceSparseApplyAdadelta.Options | Optional attributes for `ResourceSparseApplyAdadelta`.
ResourceSparseApplyAdagrad | Update relevant entries in '*var' and '*accum' according to the adagrad scheme.
ResourceSparseApplyAdagrad.Options | Optional attributes for `ResourceSparseApplyAdagrad`.
ResourceSparseApplyAdagradDa | Update entries in '*var' and '*accum' according to the proximal adagrad scheme.
ResourceSparseApplyAdagradDa.Options | Optional attributes for `ResourceSparseApplyAdagradDa`.
ResourceSparseApplyCenteredRmsProp | Update '*var' according to the centered RMSProp algorithm.
ResourceSparseApplyCenteredRmsProp.Options | Optional attributes for `ResourceSparseApplyCenteredRmsProp`.
ResourceSparseApplyFtrl | Update relevant entries in '*var' according to the Ftrl-proximal scheme.
ResourceSparseApplyFtrl.Options | Optional attributes for `ResourceSparseApplyFtrl`.
ResourceSparseApplyMomentum | Update relevant entries in '*var' and '*accum' according to the momentum scheme.
ResourceSparseApplyMomentum.Options | Optional attributes for `ResourceSparseApplyMomentum`.
ResourceSparseApplyProximalAdagrad | Sparse update of entries in '*var' and '*accum' according to the FOBOS algorithm.
ResourceSparseApplyProximalAdagrad.Options | Optional attributes for `ResourceSparseApplyProximalAdagrad`.
ResourceSparseApplyProximalGradientDescent | Sparse update of '*var' using the FOBOS algorithm with a fixed learning rate.
ResourceSparseApplyProximalGradientDescent.Options | Optional attributes for `ResourceSparseApplyProximalGradientDescent`.
ResourceSparseApplyRmsProp | Update '*var' according to the RMSProp algorithm.
ResourceSparseApplyRmsProp.Options | Optional attributes for `ResourceSparseApplyRmsProp`.
Restore | Restores tensors from a V2 checkpoint.
RestoreSlice | Restores a tensor from checkpoint files.
RestoreSlice.Options | Optional attributes for `RestoreSlice`.
Save | Saves tensors in V2 checkpoint format.
SaveSlices | Saves input tensor slices to disk.
SdcaFprint | Computes fingerprints of the input strings.
SdcaOptimizer | Distributed version of Stochastic Dual Coordinate Ascent (SDCA) optimizer for linear models with L1 + L2 regularization.
SdcaOptimizer.Options | Optional attributes for `SdcaOptimizer`.
SdcaShrinkL1 | Applies an L1 regularization shrink step to the parameters.
SparseApplyAdadelta | Update relevant entries in '*var' according to the adadelta scheme.
SparseApplyAdadelta.Options | Optional attributes for `SparseApplyAdadelta`.
SparseApplyAdagrad | Update relevant entries in '*var' and '*accum' according to the adagrad scheme.
SparseApplyAdagrad.Options | Optional attributes for `SparseApplyAdagrad`.
SparseApplyAdagradDa | Update entries in '*var' and '*accum' according to the proximal adagrad scheme.
SparseApplyAdagradDa.Options | Optional attributes for `SparseApplyAdagradDa`.
SparseApplyCenteredRmsProp | Update '*var' according to the centered RMSProp algorithm.
SparseApplyCenteredRmsProp.Options | Optional attributes for `SparseApplyCenteredRmsProp`.
SparseApplyFtrl | Update relevant entries in '*var' according to the Ftrl-proximal scheme.
SparseApplyFtrl.Options | Optional attributes for `SparseApplyFtrl`.
SparseApplyMomentum | Update relevant entries in '*var' and '*accum' according to the momentum scheme.
SparseApplyMomentum.Options | Optional attributes for `SparseApplyMomentum`.
SparseApplyProximalAdagrad | Sparse update of entries in '*var' and '*accum' according to the FOBOS algorithm.
SparseApplyProximalAdagrad.Options | Optional attributes for `SparseApplyProximalAdagrad`.
SparseApplyProximalGradientDescent | Sparse update of '*var' using the FOBOS algorithm with a fixed learning rate.
SparseApplyProximalGradientDescent.Options | Optional attributes for `SparseApplyProximalGradientDescent`.
SparseApplyRmsProp | Update '*var' according to the RMSProp algorithm.
SparseApplyRmsProp.Options | Optional attributes for `SparseApplyRmsProp`.
TileGrad | Returns the gradient of `Tile`.
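
The accumulator classes form a small protocol of their own: `ConditionalAccumulator` creates the aggregation resource, `AccumulatorApplyGradient` deposits gradients into it, and `AccumulatorTakeGradient` blocks until a required count has arrived and returns their average. A minimal sketch under the assumption that the generated `TrainOps` builders carry the camelCase names below and that `Session.Runner` has an `addTarget(Op)` overload; treat the exact signatures as assumptions that vary by release:

```java
import org.tensorflow.Graph;
import org.tensorflow.Session;
import org.tensorflow.Shape;
import org.tensorflow.Tensor;
import org.tensorflow.op.Ops;
import org.tensorflow.op.train.AccumulatorApplyGradient;
import org.tensorflow.op.train.AccumulatorTakeGradient;
import org.tensorflow.op.train.ConditionalAccumulator;

public class AccumulatorRoundTrip {
  public static void main(String[] args) {
    try (Graph g = new Graph()) {
      Ops tf = Ops.create(g);

      // Accumulator for scalar float gradients (assumed builder signature).
      ConditionalAccumulator acc =
          tf.train.conditionalAccumulator(Float.class, Shape.scalar());

      // Deposits one gradient (value 3.0) tagged with local_step 0.
      AccumulatorApplyGradient deposit =
          tf.train.accumulatorApplyGradient(acc, tf.constant(0L), tf.constant(3f));

      // Withdraws the average once at least two gradients have arrived.
      AccumulatorTakeGradient<Float> take =
          tf.train.accumulatorTakeGradient(acc, tf.constant(2), Float.class);

      try (Session s = new Session(g)) {
        s.runner().addTarget(deposit).run();  // gradient #1
        s.runner().addTarget(deposit).run();  // gradient #2: re-running deposits again
        try (Tensor<?> avg = s.runner().fetch(take.asOutput()).run().get(0)) {
          System.out.println(avg.floatValue());  // the average of the deposits: 3.0
        }
      }
    }
  }
}
```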