public class CenterLossOutputLayer extends BaseOutputLayer
Center loss is similar to triplet loss except that it enforces intraclass consistency and doesn't require feed forward of multiple examples. Center loss typically converges faster for training ImageNet-based convnets: if example x is in class Y, ensure that embedding(x) is close to average(embedding(y)) for all examples y in Y.
Nested Class Summary
Modifier and Type | Class and Description
---|---
static class | CenterLossOutputLayer.Builder
Field Summary
Modifier and Type | Field and Description
---|---
protected double | alpha
protected boolean | gradientCheck
protected double | lambda
Fields inherited from class BaseOutputLayer: hasBias, lossFn
Fields inherited from class FeedForwardLayer: nIn, nOut
Fields inherited from class BaseLayer: activationFn, biasInit, biasUpdater, gainInit, gradientNormalization, gradientNormalizationThreshold, iUpdater, regularization, regularizationBias, weightInitFn, weightNoise
Fields inherited from class Layer: constraints, iDropout, layerName
Constructor Summary
Modifier | Constructor and Description
---|---
protected | CenterLossOutputLayer(CenterLossOutputLayer.Builder builder)
Method Summary
Modifier and Type | Method and Description
---|---
double | getAlpha()
boolean | getGradientCheck()
double | getLambda()
LayerMemoryReport | getMemoryReport(InputType inputType) - This is a report of the estimated memory consumption for the given layer
IUpdater | getUpdaterByParam(String paramName) - Get the updater for the given parameter.
ParamInitializer | initializer()
Layer | instantiate(NeuralNetConfiguration conf, Collection<TrainingListener> trainingListeners, int layerIndex, INDArray layerParamsView, boolean initializeParams, org.nd4j.linalg.api.buffer.DataType networkDataType)
Methods inherited from class BaseOutputLayer: hasBias
Methods inherited from class FeedForwardLayer: getOutputType, getPreProcessorForInputType, isPretrainParam, setNIn
Methods inherited from class BaseLayer: clone, getGradientNormalization, getRegularizationByParam, resetLayerDefaultConfig
Methods inherited from class Layer: initializeConstraints, setDataType
Methods inherited from class java.lang.Object: equals, finalize, getClass, hashCode, notify, notifyAll, toString, wait, wait, wait
Methods inherited from interface TrainingConfig: getGradientNormalizationThreshold, getLayerName
Field Detail
protected double alpha
protected double lambda
protected boolean gradientCheck
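For context on what these fields control: in the center loss formulation (Wen et al., 2016) the training objective is, schematically,

    L = L_primary + lambda * (1/2) * sum_i || embedding(x_i) - c_{y_i} ||^2

where c_{y_i} is the running center for the class of example i. Here lambda weights the center loss term against the primary loss function, and alpha is the rate at which the class centers are updated each iteration; the exact scaling used by this layer may differ from the sketch. Reading gradientCheck as a switch that alters behaviour during numerical gradient checking (the centers are updated by a non-gradient rule) is an assumption based on its name. A minimal sketch of setting these values, assuming alpha(double), lambda(double) and gradientCheck(boolean) are Builder setters mirroring the fields above:

```java
import org.deeplearning4j.nn.conf.layers.CenterLossOutputLayer;
import org.nd4j.linalg.activations.Activation;
import org.nd4j.linalg.lossfunctions.LossFunctions.LossFunction;

public class CenterLossHyperparams {
    // The numeric values below are illustrative only.
    public static CenterLossOutputLayer build() {
        return new CenterLossOutputLayer.Builder(LossFunction.NEGATIVELOGLIKELIHOOD)
                .nIn(128).nOut(10)
                .activation(Activation.SOFTMAX)
                .alpha(0.1)      // update rate for the per-class centers
                .lambda(2e-4)    // weight of the center-loss term vs. the primary loss
                .gradientCheck(false)
                .build();
    }
}
```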
Constructor Detail
protected CenterLossOutputLayer(CenterLossOutputLayer.Builder builder)
Method Detail
public Layer instantiate(NeuralNetConfiguration conf, Collection<TrainingListener> trainingListeners, int layerIndex, INDArray layerParamsView, boolean initializeParams, org.nd4j.linalg.api.buffer.DataType networkDataType)
Specified by: instantiate in class Layer
public ParamInitializer initializer()
Specified by: initializer in class Layer
public IUpdater getUpdaterByParam(String paramName)
Description copied from class: BaseLayer
Get the updater for the given parameter.
Specified by: getUpdaterByParam in interface TrainingConfig
Overrides: getUpdaterByParam in class BaseLayer
Parameters: paramName - Parameter name
public double getAlpha()
public double getLambda()
public boolean getGradientCheck()
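The getters above can be used to inspect an existing layer configuration, for example one taken from a loaded network. A small sketch; the Builder(LossFunction) constructor and the "W" weight parameter key are assumptions, and the updater lookup may return null because no updater has been set on the standalone configuration:

```java
import org.deeplearning4j.nn.conf.layers.CenterLossOutputLayer;
import org.nd4j.linalg.learning.config.IUpdater;
import org.nd4j.linalg.lossfunctions.LossFunctions.LossFunction;

public class InspectCenterLoss {
    public static void main(String[] args) {
        CenterLossOutputLayer layerConf = new CenterLossOutputLayer.Builder(LossFunction.NEGATIVELOGLIKELIHOOD)
                .nIn(128).nOut(10)
                .build();

        System.out.println("alpha         = " + layerConf.getAlpha());
        System.out.println("lambda        = " + layerConf.getLambda());
        System.out.println("gradientCheck = " + layerConf.getGradientCheck());

        // "W" is the conventional weight parameter key; may print null here since
        // no updater was configured on this standalone layer configuration.
        IUpdater wUpdater = layerConf.getUpdaterByParam("W");
        System.out.println("updater for W = " + wUpdater);
    }
}
```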
public LayerMemoryReport getMemoryReport(InputType inputType)
Description copied from class: Layer
This is a report of the estimated memory consumption for the given layer
Overrides: getMemoryReport in class BaseOutputLayer
Parameters: inputType - Input type to the layer. Memory consumption is often a function of the input type
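getMemoryReport can be used to estimate the layer's memory footprint before building a network. A sketch; the Builder(LossFunction) constructor is assumed and the input width of 128 is arbitrary:

```java
import org.deeplearning4j.nn.conf.inputs.InputType;
import org.deeplearning4j.nn.conf.layers.CenterLossOutputLayer;
import org.deeplearning4j.nn.conf.memory.LayerMemoryReport;
import org.nd4j.linalg.lossfunctions.LossFunctions.LossFunction;

public class CenterLossMemory {
    public static void main(String[] args) {
        CenterLossOutputLayer layerConf = new CenterLossOutputLayer.Builder(LossFunction.NEGATIVELOGLIKELIHOOD)
                .nIn(128).nOut(10)
                .build();

        // Estimated memory consumption for a feed-forward input of width 128
        LayerMemoryReport report = layerConf.getMemoryReport(InputType.feedForward(128));
        System.out.println(report);
    }
}
```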