public class ZeroPadding1DLayer extends AbstractLayer<ZeroPadding1DLayer>
Nested classes/interfaces inherited from interface Layer: Layer.TrainingMode, Layer.Type

Fields inherited from class AbstractLayer: cacheMode, conf, dropoutApplied, dropoutMask, epochCount, index, input, iterationCount, maskArray, maskState, preOutput, trainingListeners
| Constructor and Description |
|---|
| ZeroPadding1DLayer(NeuralNetConfiguration conf) |
| Modifier and Type | Method and Description |
|---|---|
| org.nd4j.linalg.api.ndarray.INDArray | activate(boolean training, LayerWorkspaceMgr workspaceMgr): Perform forward pass and return the activations array with the last set input |
| org.nd4j.linalg.primitives.Pair<Gradient,org.nd4j.linalg.api.ndarray.INDArray> | backpropGradient(org.nd4j.linalg.api.ndarray.INDArray epsilon, LayerWorkspaceMgr workspaceMgr): Calculate the gradient relative to the error in the next layer |
| double | calcL1(boolean backpropParamsOnly): Calculate the L1 regularization term; 0.0 if regularization is not used. |
| double | calcL2(boolean backpropParamsOnly): Calculate the L2 regularization term; 0.0 if regularization is not used. |
| void | clearNoiseWeightParams() |
| Layer | clone(): Clone the layer |
| boolean | isPretrainLayer(): Returns true if the layer can be trained in an unsupervised/pretrain manner (AE, VAE, etc) |
| Layer.Type | type(): Returns the layer type |
Methods inherited from class AbstractLayer: accumulateScore, activate, addListeners, applyConstraints, applyDropOutIfNecessary, applyMask, assertInputSet, batchSize, clear, computeGradientAndScore, conf, feedForwardMaskArray, fit, fit, getGradientsViewArray, getIndex, getInput, getInputMiniBatchSize, getListeners, getMaskArray, getOptimizer, getParam, gradient, gradientAndScore, init, initParams, input, layerConf, layerId, numParams, numParams, params, paramTable, paramTable, score, setBackpropGradientsViewArray, setCacheMode, setConf, setIndex, setInput, setInputMiniBatchSize, setListeners, setListeners, setMaskArray, setParam, setParams, setParams, setParamsViewArray, setParamTable, transpose, update, update, validateInput
Methods inherited from class java.lang.Object: equals, finalize, getClass, hashCode, notify, notifyAll, toString, wait, wait, wait
Methods inherited from interface Layer: getEpochCount, getIterationCount, setEpochCount, setIterationCount
public ZeroPadding1DLayer(NeuralNetConfiguration conf)
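This runtime class is normally instantiated by DL4J from the corresponding configuration layer rather than constructed by hand. Below is a minimal sketch, assuming the configuration class org.deeplearning4j.nn.conf.layers.ZeroPadding1DLayer, a padding of 2, 16 input features, and a kernel size of 3 (all illustrative values, not part of this page):

```java
import org.deeplearning4j.nn.conf.MultiLayerConfiguration;
import org.deeplearning4j.nn.conf.NeuralNetConfiguration;
import org.deeplearning4j.nn.conf.inputs.InputType;
import org.deeplearning4j.nn.conf.layers.Convolution1DLayer;
import org.deeplearning4j.nn.multilayer.MultiLayerNetwork;

public class ZeroPadding1DConfigSketch {
    public static void main(String[] args) {
        // Pad the sequence (length) dimension by 2 on each side before a 1D convolution.
        MultiLayerConfiguration conf = new NeuralNetConfiguration.Builder()
                .list()
                .layer(0, new org.deeplearning4j.nn.conf.layers.ZeroPadding1DLayer(2))
                .layer(1, new Convolution1DLayer.Builder(3).nOut(32).build())
                .setInputType(InputType.recurrent(16)) // sequence input with 16 features
                .build();

        MultiLayerNetwork net = new MultiLayerNetwork(conf);
        net.init(); // the runtime ZeroPadding1DLayer documented here is created internally at this point
    }
}
```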
public boolean isPretrainLayer()
Description copied from interface: Layer
Returns true if the layer can be trained in an unsupervised/pretrain manner (AE, VAE, etc)
public void clearNoiseWeightParams()
public Layer.Type type()
Description copied from interface: Layer
Returns the layer type
Specified by:
type in interface Layer
Overrides:
type in class AbstractLayer<ZeroPadding1DLayer>
public org.nd4j.linalg.primitives.Pair<Gradient,org.nd4j.linalg.api.ndarray.INDArray> backpropGradient(org.nd4j.linalg.api.ndarray.INDArray epsilon, LayerWorkspaceMgr workspaceMgr)
Description copied from interface: Layer
Calculate the gradient relative to the error in the next layer
Parameters:
epsilon - w^(L+1)*delta^(L+1). Or, equivalently: dC/da, i.e., (dC/dz)*(dz/da) = dC/da, where C is the cost function and a = sigma(z) is the activation.
workspaceMgr - Workspace manager
Returns:
Pair of (Gradient, epsilon); the returned epsilon array should be placed in the ArrayType.ACTIVATION_GRAD workspace via the workspace manager
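As a shape-level sketch of what this means for a padding-only layer (which has no parameters): the returned epsilon is simply the incoming epsilon cropped back to the unpadded input length. The variable paddingLayer, the array shapes, and the padding of 2 below are illustrative assumptions, and the layer is assumed to have already run a forward pass on a matching input.

```java
import org.deeplearning4j.nn.api.Layer;
import org.deeplearning4j.nn.gradient.Gradient;
import org.deeplearning4j.nn.workspace.LayerWorkspaceMgr;
import org.nd4j.linalg.api.ndarray.INDArray;
import org.nd4j.linalg.factory.Nd4j;
import org.nd4j.linalg.primitives.Pair;

class PaddingBackpropSketch {
    // "paddingLayer" is a hypothetical, already-initialised ZeroPadding1DLayer
    // whose input (shape [8, 16, 100]) was set by a prior forward pass.
    static INDArray cropGradient(Layer paddingLayer) {
        // dL/d(output) from the layer above; length 104 = 100 + 2 + 2 padding.
        INDArray epsilon = Nd4j.rand(new int[]{8, 16, 104});
        Pair<Gradient, INDArray> p =
                paddingLayer.backpropGradient(epsilon, LayerWorkspaceMgr.noWorkspaces());
        // p.getFirst() holds no parameter gradients (the layer has none);
        // p.getSecond() is epsilon cropped back to the input shape [8, 16, 100].
        return p.getSecond();
    }
}
```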
public org.nd4j.linalg.api.ndarray.INDArray activate(boolean training, LayerWorkspaceMgr workspaceMgr)
Description copied from interface: Layer
Perform forward pass and return the activations array with the last set input
Parameters:
training - training or test mode
workspaceMgr - Workspace manager
Returns:
the activations (layer output); the returned array should be placed in the ArrayType.ACTIVATIONS workspace via the workspace manager
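A shape-level sketch of the forward pass, assuming [minibatch, channels, length] input, a padding of 2 on each side, a hypothetical already-constructed paddingLayer instance, and the three-argument activate overload inherited from AbstractLayer:

```java
import org.deeplearning4j.nn.api.Layer;
import org.deeplearning4j.nn.workspace.LayerWorkspaceMgr;
import org.nd4j.linalg.api.ndarray.INDArray;
import org.nd4j.linalg.factory.Nd4j;

class PaddingForwardSketch {
    static INDArray padForward(Layer paddingLayer) {
        INDArray input = Nd4j.rand(new int[]{8, 16, 100}); // [minibatch, channels, length]
        INDArray out = paddingLayer.activate(input, true, LayerWorkspaceMgr.noWorkspaces());
        // Zero padding appends padLeft + padRight zeros along the length dimension only,
        // e.g. padding of 2 on each side gives shape [8, 16, 104]; existing values are unchanged.
        return out;
    }
}
```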
public Layer clone()
Description copied from interface: Layer
Clone the layer
Specified by:
clone in interface Layer
Overrides:
clone in class AbstractLayer<ZeroPadding1DLayer>
public double calcL1(boolean backpropParamsOnly)
Description copied from interface: Layer
Calculate the L1 regularization term. 0.0 if regularization is not used.
Specified by:
calcL1 in interface Layer
Overrides:
calcL1 in class AbstractLayer<ZeroPadding1DLayer>
Parameters:
backpropParamsOnly - If true: calculate L1 based on backprop params only. If false: calculate based on all params (including pretrain params, if any)
public double calcL2(boolean backpropParamsOnly)
Description copied from interface: Layer
Calculate the L2 regularization term. 0.0 if regularization is not used.
Specified by:
calcL2 in interface Layer
Overrides:
calcL2 in class AbstractLayer<ZeroPadding1DLayer>
Parameters:
backpropParamsOnly - If true: calculate L2 based on backprop params only. If false: calculate based on all params (including pretrain params, if any)
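Since a zero-padding layer has no trainable parameters, both regularization terms should come back as 0.0; a tiny sketch with a hypothetical, already-constructed paddingLayer instance:

```java
// Sketch only: no parameters, so no L1/L2 penalty regardless of backpropParamsOnly.
double l1 = paddingLayer.calcL1(true);   // expected 0.0
double l2 = paddingLayer.calcL2(false);  // expected 0.0
```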