public class DropoutLayer extends BaseLayer<DropoutLayer>
Nested classes/interfaces inherited from interface org.deeplearning4j.nn.api.Layer: Layer.TrainingMode, Layer.Type

Fields inherited from class org.deeplearning4j.nn.layers.BaseLayer: gradient, gradientsFlattened, gradientViews, optimizer, params, paramsFlattened, score, solver, weightNoiseParams

Fields inherited from class org.deeplearning4j.nn.layers.AbstractLayer: cacheMode, conf, dropoutApplied, epochCount, index, input, inputModificationAllowed, iterationCount, maskArray, maskState, preOutput, trainingListeners
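In typical use this runtime class is not constructed directly: a dropout layer is declared through its configuration counterpart, org.deeplearning4j.nn.conf.layers.DropoutLayer, and the implementation documented here is instantiated when the network is initialized. A minimal configuration sketch; the class name, helper method, and layer sizes below are illustrative, not part of this API:

```java
import org.deeplearning4j.nn.conf.MultiLayerConfiguration;
import org.deeplearning4j.nn.conf.NeuralNetConfiguration;
import org.deeplearning4j.nn.conf.layers.DenseLayer;
import org.deeplearning4j.nn.conf.layers.DropoutLayer;
import org.deeplearning4j.nn.conf.layers.OutputLayer;
import org.deeplearning4j.nn.multilayer.MultiLayerNetwork;
import org.nd4j.linalg.activations.Activation;
import org.nd4j.linalg.lossfunctions.LossFunctions;

public class DropoutLayerExample {

    // Builds a small network with a standalone dropout layer placed
    // between a dense hidden layer and the output layer.
    public static MultiLayerNetwork buildNet() {
        MultiLayerConfiguration conf = new NeuralNetConfiguration.Builder()
                .list()
                .layer(0, new DenseLayer.Builder().nIn(784).nOut(256)
                        .activation(Activation.RELU).build())
                // 0.5 is DL4J's dropout value: the probability of
                // retaining each activation during training
                .layer(1, new DropoutLayer.Builder(0.5).build())
                .layer(2, new OutputLayer.Builder(LossFunctions.LossFunction.MCXENT)
                        .nIn(256).nOut(10).activation(Activation.SOFTMAX).build())
                .build();
        MultiLayerNetwork net = new MultiLayerNetwork(conf);
        net.init();
        return net;
    }
}
```

After net.init(), layer index 1 of the network is an instance of the DropoutLayer implementation documented on this page.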
| Constructor and Description |
|---|
| `DropoutLayer(NeuralNetConfiguration conf)` |
| `DropoutLayer(NeuralNetConfiguration conf, INDArray input)` |
| Modifier and Type | Method and Description |
|---|---|
| `INDArray` | `activate(boolean training, LayerWorkspaceMgr workspaceMgr)`: Perform forward pass and return the activations array with the last set input |
| `Pair<Gradient,INDArray>` | `backpropGradient(INDArray epsilon, LayerWorkspaceMgr workspaceMgr)`: Calculate the gradient relative to the error in the next layer |
| `double` | `calcL1(boolean backpropParamsOnly)`: Calculate the L1 regularization term; 0.0 if regularization is not used |
| `double` | `calcL2(boolean backpropParamsOnly)`: Calculate the L2 regularization term; 0.0 if regularization is not used |
| `void` | `fit(INDArray input, LayerWorkspaceMgr workspaceMgr)`: Fit the model to the given data |
| `boolean` | `isPretrainLayer()`: Returns true if the layer can be trained in an unsupervised/pretrain manner (AE, VAE, etc.) |
| `INDArray` | `params()`: Returns the parameters of the neural network as a flattened row vector |
| `Layer.Type` | `type()`: Returns the layer type |
Methods inherited from class org.deeplearning4j.nn.layers.BaseLayer: clear, clearNoiseWeightParams, clone, computeGradientAndScore, fit, getGradientsViewArray, getOptimizer, getParam, getParamWithNoise, gradient, hasBias, layerConf, numParams, paramTable, paramTable, preOutput, score, setBackpropGradientsViewArray, setParam, setParams, setParams, setParamsViewArray, setParamTable, setScoreWithZ, toString, update, update

Methods inherited from class org.deeplearning4j.nn.layers.AbstractLayer: activate, addListeners, allowInputModification, applyConstraints, applyDropOutIfNecessary, applyMask, assertInputSet, backpropDropOutIfPresent, batchSize, conf, feedForwardMaskArray, getConfig, getEpochCount, getHelper, getIndex, getInput, getInputMiniBatchSize, getListeners, getMaskArray, gradientAndScore, init, input, layerId, numParams, setCacheMode, setConf, setEpochCount, setIndex, setInput, setInputMiniBatchSize, setListeners, setListeners, setMaskArray

Methods inherited from class java.lang.Object: equals, finalize, getClass, hashCode, notify, notifyAll, wait, wait, wait

Methods inherited from interface org.deeplearning4j.nn.api.Layer: getIterationCount, setIterationCount
public DropoutLayer(NeuralNetConfiguration conf)
public DropoutLayer(NeuralNetConfiguration conf, INDArray input)
public double calcL2(boolean backpropParamsOnly)

Description copied from interface: Layer
Calculate the L2 regularization term. 0.0 if regularization is not used.

Specified by:
    calcL2 in interface Layer
Overrides:
    calcL2 in class BaseLayer<DropoutLayer>
Parameters:
    backpropParamsOnly - If true: calculate L2 based on backprop params only. If false: calculate based on all params (including pretrain params, if any)

public double calcL1(boolean backpropParamsOnly)

Description copied from interface: Layer
Calculate the L1 regularization term. 0.0 if regularization is not used.

Specified by:
    calcL1 in interface Layer
Overrides:
    calcL1 in class BaseLayer<DropoutLayer>
Parameters:
    backpropParamsOnly - If true: calculate L1 based on backprop params only. If false: calculate based on all params (including pretrain params, if any)
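Since a dropout layer holds no trainable parameters, both regularization terms should come out as 0.0 for this class. A quick check, assuming the hypothetical buildNet() helper from the sketch near the top of this page:

```java
MultiLayerNetwork net = DropoutLayerExample.buildNet(); // hypothetical helper from the sketch above
org.deeplearning4j.nn.api.Layer dropout = net.getLayer(1);

double l1 = dropout.calcL1(true); // expected 0.0: no parameters to penalize
double l2 = dropout.calcL2(true); // expected 0.0: no parameters to penalize
```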
public Layer.Type type()

Description copied from interface: Layer
Returns the layer type.

Specified by:
    type in interface Layer
Overrides:
    type in class AbstractLayer<DropoutLayer>
public void fit(INDArray input, LayerWorkspaceMgr workspaceMgr)

Description copied from interface: Model
Fit the model to the given data.

Specified by:
    fit in interface Model
Overrides:
    fit in class BaseLayer<DropoutLayer>
Parameters:
    input - the data to fit the model to
public Pair<Gradient,INDArray> backpropGradient(INDArray epsilon, LayerWorkspaceMgr workspaceMgr)

Description copied from interface: Layer
Calculate the gradient relative to the error in the next layer.

Specified by:
    backpropGradient in interface Layer
Overrides:
    backpropGradient in class BaseLayer<DropoutLayer>
Parameters:
    epsilon - w^(L+1)*delta^(L+1). Or, equivalently: dC/da, i.e., (dC/dz)*(dz/da) = dC/da, where C is the cost function and a = sigma(z) is the activation.
    workspaceMgr - Workspace manager
Returns:
    A Pair of the Gradient for this layer and the epsilon activation gradient array; the returned array is placed in the ArrayType.ACTIVATION_GRAD workspace via the workspace manager
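A minimal sketch of one manual forward/backward step through the dropout layer, again assuming the hypothetical buildNet() helper from above. Note that the Pair import location varies across ND4J versions; org.nd4j.linalg.primitives.Pair matches the releases this page corresponds to:

```java
import org.deeplearning4j.nn.api.Layer;
import org.deeplearning4j.nn.gradient.Gradient;
import org.deeplearning4j.nn.multilayer.MultiLayerNetwork;
import org.deeplearning4j.nn.workspace.LayerWorkspaceMgr;
import org.nd4j.linalg.api.ndarray.INDArray;
import org.nd4j.linalg.factory.Nd4j;
import org.nd4j.linalg.primitives.Pair;

public class DropoutBackpropSketch {
    public static void main(String[] args) {
        MultiLayerNetwork net = DropoutLayerExample.buildNet();
        Layer dropout = net.getLayer(1);
        LayerWorkspaceMgr mgr = LayerWorkspaceMgr.noWorkspaces();

        // Forward pass in training mode: a dropout mask is sampled and applied
        INDArray features = Nd4j.rand(32, 256);
        dropout.activate(features, true, mgr);

        // Backward pass: epsilon stands in for dC/da arriving from the layer above
        INDArray epsilon = Nd4j.ones(32, 256);
        Pair<Gradient, INDArray> result = dropout.backpropGradient(epsilon, mgr);
        INDArray epsilonOut = result.getSecond(); // activation gradient handed to the layer below
        Gradient gradient = result.getFirst();    // carries no entries: dropout has no parameters
    }
}
```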
public INDArray activate(boolean training, LayerWorkspaceMgr workspaceMgr)

Description copied from interface: Layer
Perform forward pass and return the activations array with the last set input.

Specified by:
    activate in interface Layer
Overrides:
    activate in class BaseLayer<DropoutLayer>
Parameters:
    training - training or test mode
    workspaceMgr - Workspace manager
Returns:
    The activations array, placed in the ArrayType.ACTIVATIONS workspace via the workspace manager
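The training flag is what switches dropout on and off. DL4J uses inverted dropout, rescaling the surviving activations during training, so at test time the input passes through unchanged. A short continuation of the sketch above, reusing its dropout and mgr variables:

```java
INDArray in = Nd4j.rand(32, 256);

INDArray trainOut = dropout.activate(in, true, mgr);  // mask sampled; survivors rescaled by 1/p
INDArray testOut  = dropout.activate(in, false, mgr); // no mask applied; values pass through
```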
public boolean isPretrainLayer()

Returns true if the layer can be trained in an unsupervised/pretrain manner (AE, VAE, etc.)

Specified by:
    isPretrainLayer in interface Layer
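For this class the answer is always no; a one-line check with the same dropout handle as in the sketches above:

```java
boolean pretrain = dropout.isPretrainLayer(); // false: dropout has no unsupervised pretraining phase
```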