public class LossLayer extends FeedForwardLayer
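LossLayer couples a loss function (the lossFn field below) with an activation function and adds no trainable parameters of its own. A minimal configuration sketch, assuming the DL4J 0.9.x-era API; the layer sizes (784 in / 10 out) and the loss/activation choices are illustrative assumptions, not taken from this page:

```java
import org.deeplearning4j.nn.conf.MultiLayerConfiguration;
import org.deeplearning4j.nn.conf.NeuralNetConfiguration;
import org.deeplearning4j.nn.conf.layers.DenseLayer;
import org.deeplearning4j.nn.conf.layers.LossLayer;
import org.deeplearning4j.nn.multilayer.MultiLayerNetwork;
import org.nd4j.linalg.activations.Activation;
import org.nd4j.linalg.lossfunctions.LossFunctions.LossFunction;

public class LossLayerExample {
    public static void main(String[] args) {
        MultiLayerConfiguration conf = new NeuralNetConfiguration.Builder()
                .list()
                // Hidden layer produces the raw pre-loss activations
                .layer(0, new DenseLayer.Builder().nIn(784).nOut(10).build())
                // LossLayer applies softmax plus multi-class cross entropy;
                // it contributes no weights of its own
                .layer(1, new LossLayer.Builder(LossFunction.MCXENT)
                        .activation(Activation.SOFTMAX)
                        .build())
                .build();

        MultiLayerNetwork net = new MultiLayerNetwork(conf);
        net.init(); // instantiate(...) is invoked internally for each layer
    }
}
```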
Nested Class Summary

Modifier and Type | Class and Description |
---|---|
static class | LossLayer.Builder |
Field Summary

Modifier and Type | Field and Description |
---|---|
protected org.nd4j.linalg.lossfunctions.ILossFunction | lossFn |
Fields inherited from class FeedForwardLayer: nIn, nOut

Fields inherited from class Layer: activationFn, adamMeanDecay, adamVarDecay, biasInit, biasLearningRate, dist, epsilon, gradientNormalization, gradientNormalizationThreshold, iUpdater, l1, l1Bias, l2, l2Bias, learningRate, learningRateSchedule, momentum, momentumSchedule, rho, rmsDecay, updater, weightInit
Constructor Summary

Modifier | Constructor and Description |
---|---|
protected | LossLayer(LossLayer.Builder builder) |
Method Summary

Modifier and Type | Method and Description |
---|---|
LayerMemoryReport | getMemoryReport(InputType inputType): This is a report of the estimated memory consumption for the given layer. |
ParamInitializer | initializer() |
Layer | instantiate(NeuralNetConfiguration conf, Collection<IterationListener> iterationListeners, int layerIndex, org.nd4j.linalg.api.ndarray.INDArray layerParamsView, boolean initializeParams) |
boolean | isPretrainParam(String paramName): Is the specified parameter a layerwise pretraining-only parameter? For example, visible bias params in an autoencoder (or decoder params in a variational autoencoder) aren't used during supervised backprop. Layers with no pretrainable parameters, such as DenseLayer, return false for all valid inputs. |
Methods inherited from class FeedForwardLayer: getL1ByParam, getL2ByParam, getLearningRateByParam, getOutputType, getPreProcessorForInputType, setNIn

Methods inherited from class Layer: clone, getIUpdaterByParam, getUpdaterByParam, resetLayerDefaultConfig
Constructor Detail

protected LossLayer(LossLayer.Builder builder)
Method Detail

public Layer instantiate(NeuralNetConfiguration conf, Collection<IterationListener> iterationListeners, int layerIndex, org.nd4j.linalg.api.ndarray.INDArray layerParamsView, boolean initializeParams)

Creates the runtime layer from this configuration; in typical usage it is invoked internally during network initialization rather than called directly.

Specified by: instantiate in class Layer
public boolean isPretrainParam(String paramName)

Description copied from class: Layer
Is the specified parameter a layerwise pretraining-only parameter? For example, visible bias params in an autoencoder (or decoder params in a variational autoencoder) aren't used during supervised backprop. Layers with no pretrainable parameters, such as DenseLayer, return false for all valid inputs.

Overrides: isPretrainParam in class FeedForwardLayer
Parameters: paramName - Parameter name/key
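To make the description above concrete, a minimal sketch using an autoencoder, as in the example given. The parameter keys "vb" (visible bias) and "W" (weights) are the conventional DL4J names, and the expected results are assumptions based on the behavior described above:

```java
import org.deeplearning4j.nn.conf.layers.AutoEncoder;

public class PretrainParamCheck {
    public static void main(String[] args) {
        // Layer sizes are arbitrary; only the parameter keys matter here
        AutoEncoder ae = new AutoEncoder.Builder().nIn(100).nOut(50).build();

        // Visible bias exists only for layerwise pretraining (expected: true)
        System.out.println(ae.isPretrainParam("vb"));
        // Weights are also used in supervised backprop (expected: false)
        System.out.println(ae.isPretrainParam("W"));
    }
}
```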
public LayerMemoryReport getMemoryReport(InputType inputType)

Description copied from class: Layer
This is a report of the estimated memory consumption for the given layer.

Specified by: getMemoryReport in class Layer
Parameters: inputType - Input type to the layer. Memory consumption is often a function of the input type.
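A minimal sketch of requesting a memory report, assuming the 0.9.x-era package layout; the loss function and InputType.feedForward(10) are arbitrary illustrative choices:

```java
import org.deeplearning4j.nn.conf.inputs.InputType;
import org.deeplearning4j.nn.conf.layers.LossLayer;
import org.deeplearning4j.nn.conf.memory.LayerMemoryReport;
import org.nd4j.linalg.lossfunctions.LossFunctions.LossFunction;

public class MemoryReportExample {
    public static void main(String[] args) {
        LossLayer lossLayer = new LossLayer.Builder(LossFunction.MSE).build();

        // Memory consumption is a function of the input type, so one is supplied
        LayerMemoryReport report =
                lossLayer.getMemoryReport(InputType.feedForward(10));
        System.out.println(report);
    }
}
```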
public ParamInitializer initializer()

Specified by: initializer in class Layer