public abstract class BaseOutputLayer extends FeedForwardLayer
Nested Class Summary

Modifier and Type | Class and Description
---|---
static class | BaseOutputLayer.Builder<T extends BaseOutputLayer.Builder<T>>
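The self-referential type parameter `T extends BaseOutputLayer.Builder<T>` lets each subclass builder return its own concrete type from the setters it inherits, so fluent chains keep the subclass type instead of degrading to the base builder. A minimal standalone sketch of that pattern (`BaseBuilder` and `OutputBuilder` are illustrative names, not the DL4J classes):

```java
// Minimal sketch of the self-typed ("curiously recurring") builder pattern
// used by BaseOutputLayer.Builder<T extends BaseOutputLayer.Builder<T>>.
// BaseBuilder / OutputBuilder are illustrative names, not DL4J classes.
abstract class BaseBuilder<T extends BaseBuilder<T>> {
    int nIn, nOut;

    @SuppressWarnings("unchecked")
    T nIn(int n) { this.nIn = n; return (T) this; }   // returns the subclass type

    @SuppressWarnings("unchecked")
    T nOut(int n) { this.nOut = n; return (T) this; } // returns the subclass type
}

final class OutputBuilder extends BaseBuilder<OutputBuilder> {
    String lossFn = "mcxent";

    OutputBuilder lossFn(String fn) { this.lossFn = fn; return this; }

    String build() {
        return "OutputLayer(nIn=" + nIn + ", nOut=" + nOut + ", loss=" + lossFn + ")";
    }
}
```

Because `nIn(...)` returns `OutputBuilder` rather than `BaseBuilder`, a chain like `new OutputBuilder().nIn(784).nOut(10).lossFn("mse").build()` compiles even though `lossFn` is declared only on the subclass.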
Field Summary

Modifier and Type | Field and Description
---|---
protected org.nd4j.linalg.lossfunctions.ILossFunction | lossFn
Fields inherited from class FeedForwardLayer: nIn, nOut

Fields inherited from class Layer: activationFn, adamMeanDecay, adamVarDecay, biasInit, biasLearningRate, dist, epsilon, gradientNormalization, gradientNormalizationThreshold, iUpdater, l1, l1Bias, l2, l2Bias, learningRate, learningRateSchedule, momentum, momentumSchedule, rho, rmsDecay, updater, weightInit
Constructor Summary

Modifier | Constructor and Description
---|---
protected | BaseOutputLayer(BaseOutputLayer.Builder builder)
Method Summary

Modifier and Type | Method and Description
---|---
org.nd4j.linalg.lossfunctions.LossFunctions.LossFunction | getLossFunction(). Deprecated as of 0.6.0; use getLossFn() instead.
LayerMemoryReport | getMemoryReport(InputType inputType). Returns a report of the estimated memory consumption for the given layer.
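The deprecated `getLossFunction()` returns the coarse `LossFunctions.LossFunction` enum, while its replacement `getLossFn()` returns the full `ILossFunction` object held in the `lossFn` field. A standalone sketch of this delegate-and-deprecate idiom (the types below are simplified stand-ins for the ND4J interfaces, not the real ones):

```java
// Illustrative stand-ins for the org.nd4j.linalg.lossfunctions types.
interface ILossFunction { String name(); }

enum LossFunction { MSE, MCXENT }

class OutputConfig {
    // Stand-in for the protected lossFn field on BaseOutputLayer.
    protected ILossFunction lossFn = () -> "MCXENT";

    /** Preferred accessor: returns the full loss-function object. */
    public ILossFunction getLossFn() { return lossFn; }

    /** @deprecated As of 0.6.0, use {@link #getLossFn()} instead. */
    @Deprecated
    public LossFunction getLossFunction() {
        // Delegate to the new accessor and map back to the legacy enum.
        return LossFunction.valueOf(getLossFn().name());
    }
}
```

Keeping the old method as a thin delegate preserves source compatibility while steering new code toward the richer `ILossFunction` API.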
Methods inherited from class FeedForwardLayer: getL1ByParam, getL2ByParam, getLearningRateByParam, getOutputType, getPreProcessorForInputType, isPretrainParam, setNIn

Methods inherited from class Layer: clone, getIUpdaterByParam, getUpdaterByParam, initializer, instantiate, resetLayerDefaultConfig
Constructor Detail

protected BaseOutputLayer(BaseOutputLayer.Builder builder)
Method Detail

@Deprecated
public org.nd4j.linalg.lossfunctions.LossFunctions.LossFunction getLossFunction()
Deprecated. As of 0.6.0, use getLossFn() instead.

public LayerMemoryReport getMemoryReport(InputType inputType)
This is a report of the estimated memory consumption for the given layer.
Specified by:
getMemoryReport in class Layer
Parameters:
inputType - Input type to the layer. Memory consumption is often a function of the input type.

Copyright © 2017. All rights reserved.
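As the parameter description notes, memory consumption is often a function of the input type: parameter memory is fixed by nIn and nOut, while activation memory additionally scales with the minibatch size. A simplified, standalone illustration of that accounting (the real `LayerMemoryReport` tracks many more categories, such as gradients, updater state, and working memory):

```java
// Simplified sketch of the accounting a per-layer memory report performs.
// Not the DL4J LayerMemoryReport API; an illustration of the idea only.
class SimpleMemoryReport {
    static final int BYTES_PER_ELEMENT = 4; // assuming float32 storage

    final int nIn, nOut;

    SimpleMemoryReport(int nIn, int nOut) { this.nIn = nIn; this.nOut = nOut; }

    /** Fixed cost: weight matrix (nIn * nOut) plus bias vector (nOut). */
    long parameterBytes() {
        return (long) (nIn * nOut + nOut) * BYTES_PER_ELEMENT;
    }

    /** Variable cost: output activations scale with the minibatch size. */
    long activationBytes(int minibatch) {
        return (long) minibatch * nOut * BYTES_PER_ELEMENT;
    }
}
```

For a layer with nIn=100 and nOut=10, parameter memory is (100*10 + 10) * 4 = 4040 bytes regardless of input, while a minibatch of 32 adds 32 * 10 * 4 = 1280 bytes of output activations, which is why the report needs the `InputType` to produce an estimate.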