Abs in Layers
Adagrad in Optimizers
Adam in Optimizers
AnyLayerOps in DifferentiableAny
Aux in Layer
Batch in ToLayer, ToLiteral
abs in MathFunctions
abs(Double) in DifferentiableDouble
abs(Float) in DifferentiableFloat
abs(INDArray) in DifferentiableINDArray
addReference in ReferenceCount, Weight, Output, Throw, Batch, Literal
anyToLiteral in DifferentiableAny
apply in Trainable, Weight, SeqLayerOps
autoToLayer in Symbolic