public abstract class BaseOutputLayer extends FeedForwardLayer
Modifier and Type | Class and Description |
---|---|
static class | BaseOutputLayer.Builder<T extends BaseOutputLayer.Builder<T>> |
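The self-referential generic bound `T extends BaseOutputLayer.Builder<T>` lets subclass builders return their own concrete type from inherited chained setters. The following is a minimal, self-contained sketch of that pattern; all class, field, and method names here are illustrative stand-ins, not DL4J's actual implementation:

```java
// Sketch of the self-referential ("curiously recurring") builder pattern
// used by BaseOutputLayer.Builder<T extends BaseOutputLayer.Builder<T>>.
// Names are hypothetical; this is not DL4J source code.
abstract class BaseLayer {
    final String lossFn;
    BaseLayer(Builder<?> b) { this.lossFn = b.lossFn; }

    abstract static class Builder<T extends Builder<T>> {
        String lossFn = "MSE";

        // Returns T (the subclass builder type), so chaining keeps
        // subclass-specific setters available after this call.
        @SuppressWarnings("unchecked")
        T lossFunction(String fn) { this.lossFn = fn; return (T) this; }

        abstract BaseLayer build();
    }
}

final class SoftmaxOutputLayer extends BaseLayer {
    final int nOut;
    SoftmaxOutputLayer(Builder b) { super(b); this.nOut = b.nOut; }

    static final class Builder extends BaseLayer.Builder<Builder> {
        int nOut;
        Builder nOut(int n) { this.nOut = n; return this; }  // subclass-only setter
        @Override SoftmaxOutputLayer build() { return new SoftmaxOutputLayer(this); }
    }
}

public class BuilderDemo {
    public static void main(String[] args) {
        // lossFunction() is inherited yet returns the subclass Builder,
        // so nOut() can still be chained after it.
        SoftmaxOutputLayer layer = new SoftmaxOutputLayer.Builder()
                .lossFunction("MCXENT")
                .nOut(10)
                .build();
        System.out.println(layer.lossFn + " " + layer.nOut);  // prints "MCXENT 10"
    }
}
```

Without the `T` parameter, `lossFunction(...)` would return the base builder type and the subsequent `.nOut(10)` call would not compile.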
Modifier and Type | Field and Description |
---|---|
protected org.nd4j.linalg.lossfunctions.ILossFunction | lossFn |
Fields inherited from class FeedForwardLayer: nIn, nOut
Fields inherited from class Layer: activationFunction, adamMeanDecay, adamVarDecay, biasInit, biasL1, biasL2, biasLearningRate, dist, dropOut, epsilon, gradientNormalization, gradientNormalizationThreshold, l1, l2, layerName, learningRate, learningRateSchedule, momentum, momentumSchedule, rho, rmsDecay, updater, weightInit
Modifier | Constructor and Description |
---|---|
protected | BaseOutputLayer(BaseOutputLayer.Builder builder) |
Modifier and Type | Method and Description |
---|---|
org.nd4j.linalg.lossfunctions.LossFunctions.LossFunction | getLossFunction() Deprecated. As of 0.6.0, use getLossFn() instead. |
Methods inherited from class FeedForwardLayer: getL1ByParam, getL2ByParam, getLearningRateByParam, getOutputType, getPreProcessorForInputType, setNIn
Methods inherited from class Layer: clone, getUpdaterByParam, initializer, instantiate
protected BaseOutputLayer(BaseOutputLayer.Builder builder)
@Deprecated
public org.nd4j.linalg.lossfunctions.LossFunctions.LossFunction getLossFunction()
Deprecated. As of 0.6.0, use getLossFn() instead.
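The deprecation above keeps a legacy enum-returning accessor alive for backward compatibility while steering callers to the newer `getLossFn()` field accessor. A simplified, self-contained sketch of that delegation pattern (types and names are stand-ins, not DL4J's actual source):

```java
// Sketch of a deprecated accessor delegating to the newer API.
// Enum and interface here are simplified stand-ins for DL4J's
// LossFunctions.LossFunction and ILossFunction.
class OutputLayerConfig {
    enum LossFunction { MSE, MCXENT }

    interface ILossFunction {
        LossFunction kind();  // which legacy enum value this corresponds to
    }

    protected ILossFunction lossFn = () -> LossFunction.MCXENT;

    /** Preferred accessor: returns the ILossFunction instance itself. */
    public ILossFunction getLossFn() { return lossFn; }

    /** @deprecated As of 0.6.0. Use {@link #getLossFn()} instead. */
    @Deprecated
    public LossFunction getLossFunction() {
        // Delegate to the new accessor and map back to the legacy enum.
        return getLossFn().kind();
    }
}
```

Keeping the old method as a thin delegate means existing call sites continue to compile (with a deprecation warning) while both paths stay consistent with the single `lossFn` field.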