public class Subsampling1DLayer extends SubsamplingLayer
Nested classes/interfaces inherited from interface org.deeplearning4j.nn.api.Layer: Layer.TrainingMode, Layer.Type
Fields inherited from class org.deeplearning4j.nn.layers.convolution.subsampling.SubsamplingLayer: convolutionMode, helper, helperCountFail
Fields inherited from class org.deeplearning4j.nn.layers.AbstractLayer: cacheMode, conf, dropoutApplied, dropoutMask, epochCount, index, input, iterationCount, maskArray, maskState, preOutput, trainingListeners
| Constructor and Description |
|---|
| Subsampling1DLayer(NeuralNetConfiguration conf) |
| Subsampling1DLayer(NeuralNetConfiguration conf, org.nd4j.linalg.api.ndarray.INDArray input) |
| Modifier and Type | Method and Description |
|---|---|
| org.nd4j.linalg.api.ndarray.INDArray | activate(boolean training, LayerWorkspaceMgr workspaceMgr): Perform forward pass and return the activations array with the last set input |
| org.nd4j.linalg.primitives.Pair<Gradient,org.nd4j.linalg.api.ndarray.INDArray> | backpropGradient(org.nd4j.linalg.api.ndarray.INDArray epsilon, LayerWorkspaceMgr workspaceMgr): Calculate the gradient relative to the error in the next layer |
Methods inherited from class org.deeplearning4j.nn.layers.convolution.subsampling.SubsamplingLayer: accumulateScore, calcL1, calcL2, clearNoiseWeightParams, clone, fit, fit, getParam, gradient, isPretrainLayer, numParams, params, score, setParams, transpose, type, update
Methods inherited from class org.deeplearning4j.nn.layers.AbstractLayer: activate, addListeners, applyConstraints, applyDropOutIfNecessary, applyMask, assertInputSet, batchSize, clear, computeGradientAndScore, conf, feedForwardMaskArray, getGradientsViewArray, getIndex, getInput, getInputMiniBatchSize, getListeners, getMaskArray, getOptimizer, gradientAndScore, init, initParams, input, layerConf, layerId, numParams, paramTable, paramTable, setBackpropGradientsViewArray, setCacheMode, setConf, setIndex, setInput, setInputMiniBatchSize, setListeners, setListeners, setMaskArray, setParam, setParams, setParamsViewArray, setParamTable, update, validateInput
Methods inherited from class java.lang.Object: equals, finalize, getClass, hashCode, notify, notifyAll, toString, wait, wait, wait
Methods inherited from interface org.deeplearning4j.nn.api.Layer: getEpochCount, getIterationCount, setEpochCount, setIterationCount
public Subsampling1DLayer(NeuralNetConfiguration conf)
public Subsampling1DLayer(NeuralNetConfiguration conf, org.nd4j.linalg.api.ndarray.INDArray input)
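For orientation, here is a minimal, hypothetical sketch of how the single-argument constructor might be used. The configuration-side class org.deeplearning4j.nn.conf.layers.Subsampling1DLayer and its Builder (kernelSize, stride) are assumptions not documented on this page; in a normal workflow this implementation layer is instantiated by the network rather than constructed by hand.

```java
import org.deeplearning4j.nn.conf.NeuralNetConfiguration;

public class Subsampling1DConstructionSketch {
    public static void main(String[] args) {
        // Assumption: the configuration-side builder for a 1D subsampling (pooling) layer,
        // kernel size 2 and stride 2, wrapped in a single-layer NeuralNetConfiguration.
        NeuralNetConfiguration conf = new NeuralNetConfiguration.Builder()
                .layer(new org.deeplearning4j.nn.conf.layers.Subsampling1DLayer.Builder()
                        .kernelSize(2)
                        .stride(2)
                        .build())
                .build();

        // The single-argument constructor documented above: wraps the configuration,
        // with no input set yet. Subsampling layers hold no trainable parameters.
        org.deeplearning4j.nn.layers.convolution.subsampling.Subsampling1DLayer layer =
                new org.deeplearning4j.nn.layers.convolution.subsampling.Subsampling1DLayer(conf);

        System.out.println(layer.numParams()); // expected: 0
    }
}
```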
public org.nd4j.linalg.primitives.Pair<Gradient,org.nd4j.linalg.api.ndarray.INDArray> backpropGradient(org.nd4j.linalg.api.ndarray.INDArray epsilon, LayerWorkspaceMgr workspaceMgr)

Description copied from interface: Layer
Calculate the gradient relative to the error in the next layer.

Specified by: backpropGradient in interface Layer
Overrides: backpropGradient in class SubsamplingLayer

Parameters:
epsilon - w^(l+1) * delta^(l+1); equivalently dC/da, i.e. (dC/dz)*(dz/da), where C is the cost function and a = sigma(z) is the activation
workspaceMgr - Workspace manager

Returns: the gradient and epsilon pair for this layer; the returned epsilon (activation gradient) array is placed in the ArrayType.ACTIVATION_GRAD workspace via the workspace manager
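The following is a rough, self-contained sketch (not part of the Javadoc) of calling backpropGradient in isolation. It assumes LayerWorkspaceMgr.noWorkspaces(), the conf-side Subsampling1DLayer builder shown earlier, and the 1D input layout [minibatch, channels, sequenceLength]; the epsilon passed in must match the pooled output shape.

```java
import org.deeplearning4j.nn.conf.NeuralNetConfiguration;
import org.deeplearning4j.nn.gradient.Gradient;
import org.deeplearning4j.nn.workspace.LayerWorkspaceMgr;
import org.nd4j.linalg.api.ndarray.INDArray;
import org.nd4j.linalg.factory.Nd4j;
import org.nd4j.linalg.primitives.Pair;

public class Subsampling1DBackpropSketch {
    public static void main(String[] args) {
        // Assumed configuration: kernel 2, stride 2 (conf-side builder, not documented on this page).
        NeuralNetConfiguration conf = new NeuralNetConfiguration.Builder()
                .layer(new org.deeplearning4j.nn.conf.layers.Subsampling1DLayer.Builder()
                        .kernelSize(2).stride(2).build())
                .build();
        org.deeplearning4j.nn.layers.convolution.subsampling.Subsampling1DLayer layer =
                new org.deeplearning4j.nn.layers.convolution.subsampling.Subsampling1DLayer(conf);

        // No workspaces: simplest workspace manager for a standalone call.
        LayerWorkspaceMgr mgr = LayerWorkspaceMgr.noWorkspaces();

        // Input layout assumed to be [minibatch, channels, sequenceLength].
        // Forward pass first, so the layer has the state needed for the backward pass.
        INDArray input = Nd4j.rand(new int[]{2, 3, 10});
        layer.setInput(input, mgr);
        INDArray out = layer.activate(true, mgr);          // pooled output, shape [2, 3, 5]

        // epsilon (dC/da) must have the same shape as the layer output; ones for illustration.
        INDArray epsilon = Nd4j.ones(new int[]{2, 3, 5});
        Pair<Gradient, INDArray> gradAndEps = layer.backpropGradient(epsilon, mgr);

        // Second element is the epsilon passed on to the layer below; same shape as the input.
        System.out.println(java.util.Arrays.toString(gradAndEps.getSecond().shape()));
    }
}
```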
public org.nd4j.linalg.api.ndarray.INDArray activate(boolean training, LayerWorkspaceMgr workspaceMgr)

Description copied from interface: Layer
Perform forward pass and return the activations array with the last set input.

Specified by: activate in interface Layer
Overrides: activate in class SubsamplingLayer

Parameters:
training - training or test mode
workspaceMgr - Workspace manager

Returns: the activations (layer output) array, placed in the ArrayType.ACTIVATIONS workspace via the workspace manager
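A similarly hedged forward-pass-only sketch for activate, under the same assumptions (conf-side builder, LayerWorkspaceMgr.noWorkspaces(), [minibatch, channels, sequenceLength] input):

```java
import org.deeplearning4j.nn.conf.NeuralNetConfiguration;
import org.deeplearning4j.nn.workspace.LayerWorkspaceMgr;
import org.nd4j.linalg.api.ndarray.INDArray;
import org.nd4j.linalg.factory.Nd4j;

public class Subsampling1DActivateSketch {
    public static void main(String[] args) {
        // Same assumed conf-side builder as above: kernel 2, stride 2 halves the sequence length.
        NeuralNetConfiguration conf = new NeuralNetConfiguration.Builder()
                .layer(new org.deeplearning4j.nn.conf.layers.Subsampling1DLayer.Builder()
                        .kernelSize(2).stride(2).build())
                .build();
        org.deeplearning4j.nn.layers.convolution.subsampling.Subsampling1DLayer layer =
                new org.deeplearning4j.nn.layers.convolution.subsampling.Subsampling1DLayer(conf);

        LayerWorkspaceMgr mgr = LayerWorkspaceMgr.noWorkspaces();

        // training = false: test-mode forward pass on the last set input.
        layer.setInput(Nd4j.rand(new int[]{1, 4, 8}), mgr);  // [minibatch, channels, length]
        INDArray activations = layer.activate(false, mgr);
        System.out.println(java.util.Arrays.toString(activations.shape())); // expected [1, 4, 4]
    }
}
```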