public class ElementWiseMultiplicationLayer extends BaseLayer<ElementWiseMultiplicationLayer>
Element-wise multiplication layer with weights: implements out = activationFn(input .* w + b), where w is a learnable weight vector of length nOut, ".*" is element-wise multiplication, and b is a learnable bias vector. Note that the input and output sizes of this layer are the same.

Created by jingshu.
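The sketch below shows how a layer of this type might be added to a network configuration. It is illustrative only: it assumes the corresponding configuration class org.deeplearning4j.nn.conf.layers.misc.ElementWiseMultiplicationLayer and its Builder, and the surrounding layer sizes and activations are arbitrary choices. Because the multiplication is element-wise, nIn and nOut must be equal.

```java
import org.deeplearning4j.nn.conf.MultiLayerConfiguration;
import org.deeplearning4j.nn.conf.NeuralNetConfiguration;
import org.deeplearning4j.nn.conf.layers.DenseLayer;
import org.deeplearning4j.nn.conf.layers.OutputLayer;
import org.deeplearning4j.nn.conf.layers.misc.ElementWiseMultiplicationLayer;
import org.deeplearning4j.nn.multilayer.MultiLayerNetwork;
import org.nd4j.linalg.activations.Activation;
import org.nd4j.linalg.lossfunctions.LossFunctions;

public class ElementWiseUsageSketch {
    public static void main(String[] args) {
        MultiLayerConfiguration conf = new NeuralNetConfiguration.Builder()
                .list()
                .layer(0, new DenseLayer.Builder().nIn(10).nOut(16)
                        .activation(Activation.RELU).build())
                // Element-wise layer: out = activationFn(input .* w + b), so nIn == nOut
                .layer(1, new ElementWiseMultiplicationLayer.Builder()
                        .nIn(16).nOut(16)
                        .activation(Activation.IDENTITY)
                        .build())
                .layer(2, new OutputLayer.Builder(LossFunctions.LossFunction.MSE)
                        .nIn(16).nOut(3).activation(Activation.IDENTITY).build())
                .build();

        MultiLayerNetwork net = new MultiLayerNetwork(conf);
        net.init();
    }
}
```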
Nested classes/interfaces inherited from interface org.deeplearning4j.nn.api.Layer: Layer.TrainingMode, Layer.Type
Fields inherited from class org.deeplearning4j.nn.layers.BaseLayer: gradient, gradientsFlattened, gradientViews, optimizer, params, paramsFlattened, score, solver, weightNoiseParams
Fields inherited from class org.deeplearning4j.nn.layers.AbstractLayer: cacheMode, conf, dropoutApplied, dropoutMask, epochCount, index, input, iterationCount, maskArray, maskState, preOutput, trainingListeners
| Constructor and Description |
|---|
| ElementWiseMultiplicationLayer(NeuralNetConfiguration conf) |
| ElementWiseMultiplicationLayer(NeuralNetConfiguration conf, org.nd4j.linalg.api.ndarray.INDArray input) |
| Modifier and Type | Method and Description |
|---|---|
| org.nd4j.linalg.primitives.Pair<Gradient,org.nd4j.linalg.api.ndarray.INDArray> | backpropGradient(org.nd4j.linalg.api.ndarray.INDArray epsilon, LayerWorkspaceMgr workspaceMgr): Calculate the gradient relative to the error in the next layer |
| boolean | isPretrainLayer(): Returns true if the layer can be trained in an unsupervised/pretrain manner (VAE, RBMs etc) |
| org.nd4j.linalg.api.ndarray.INDArray | preOutput(boolean training, LayerWorkspaceMgr workspaceMgr) |
Methods inherited from class org.deeplearning4j.nn.layers.BaseLayer: accumulateScore, activate, calcL1, calcL2, clear, clearNoiseWeightParams, clone, computeGradientAndScore, fit, fit, getGradientsViewArray, getOptimizer, getParam, getParamWithNoise, gradient, hasBias, initParams, layerConf, numParams, params, paramTable, paramTable, score, setBackpropGradientsViewArray, setParam, setParams, setParams, setParamsViewArray, setParamTable, setScoreWithZ, toString, transpose, update, update
Methods inherited from class org.deeplearning4j.nn.layers.AbstractLayer: activate, addListeners, applyConstraints, applyDropOutIfNecessary, applyMask, assertInputSet, batchSize, conf, feedForwardMaskArray, getIndex, getInput, getInputMiniBatchSize, getListeners, getMaskArray, gradientAndScore, init, input, layerId, numParams, setCacheMode, setConf, setIndex, setInput, setInputMiniBatchSize, setListeners, setListeners, setMaskArray, type, validateInput
Methods inherited from class java.lang.Object: equals, finalize, getClass, hashCode, notify, notifyAll, wait, wait, wait
Methods inherited from interface org.deeplearning4j.nn.api.Layer: getEpochCount, getIterationCount, setEpochCount, setIterationCount
public ElementWiseMultiplicationLayer(NeuralNetConfiguration conf)
public ElementWiseMultiplicationLayer(NeuralNetConfiguration conf, org.nd4j.linalg.api.ndarray.INDArray input)
public org.nd4j.linalg.primitives.Pair<Gradient,org.nd4j.linalg.api.ndarray.INDArray> backpropGradient(org.nd4j.linalg.api.ndarray.INDArray epsilon, LayerWorkspaceMgr workspaceMgr)
Description copied from interface: Layer
Calculate the gradient relative to the error in the next layer.
Specified by:
backpropGradient in interface Layer
Overrides:
backpropGradient in class BaseLayer<ElementWiseMultiplicationLayer>
Parameters:
epsilon - w^(l+1) * delta^(l+1), or equivalently dC/da, where C is the cost function and a = sigma(z) is the activation.
workspaceMgr - Workspace manager
Returns:
A Pair<Gradient,INDArray> containing the gradient for this layer and the activation gradient (epsilon) to pass to the previous layer. The returned activation-gradient array should be placed in the ArrayType.ACTIVATION_GRAD workspace via the workspace manager.
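Since the pre-output of this layer is z = input .* w + b (with w and b broadcast as row vectors over the minibatch), the backward pass reduces to element-wise products. The following is a simplified, hypothetical sketch in plain ND4J, not the layer's actual implementation: dLdz stands for the incoming epsilon already multiplied by the activation function's derivative, and workspace handling via LayerWorkspaceMgr is omitted.

```java
import org.nd4j.linalg.api.ndarray.INDArray;

public class ElementWiseBackpropSketch {
    // Simplified gradients for out = activationFn(input .* w + b).
    // input: [miniBatch, nOut]; w: row vector of length nOut; dLdz: [miniBatch, nOut].
    public static INDArray[] gradients(INDArray input, INDArray w, INDArray dLdz) {
        INDArray dLdW = dLdz.mul(input).sum(0);       // dL/dw_j = sum_i dLdz[i,j] * input[i,j]
        INDArray dLdb = dLdz.sum(0);                  // dL/db_j = sum_i dLdz[i,j]
        INDArray epsilonOut = dLdz.mulRowVector(w);   // dL/dinput[i,j] = dLdz[i,j] * w_j
        return new INDArray[]{dLdW, dLdb, epsilonOut};
    }
}
```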
public boolean isPretrainLayer()
Returns true if the layer can be trained in an unsupervised/pretrain manner (VAE, RBMs etc).
public org.nd4j.linalg.api.ndarray.INDArray preOutput(boolean training, LayerWorkspaceMgr workspaceMgr)
Overrides:
preOutput in class BaseLayer<ElementWiseMultiplicationLayer>
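For reference, the pre-output this method computes is z = input .* w + b, applied row-wise across the minibatch. A minimal, hypothetical ND4J sketch follows; parameter lookup and workspace handling are omitted, and all variable names are illustrative.

```java
import org.nd4j.linalg.api.ndarray.INDArray;
import org.nd4j.linalg.factory.Nd4j;

public class ElementWisePreOutputSketch {
    // z = input .* w + b, broadcasting the weight and bias row vectors over the minibatch
    public static INDArray preOutput(INDArray input, INDArray w, INDArray b) {
        return input.mulRowVector(w).addRowVector(b);
    }

    public static void main(String[] args) {
        INDArray input = Nd4j.rand(4, 8);   // [miniBatch, nOut]
        INDArray w = Nd4j.rand(1, 8);       // learnable weight row vector
        INDArray b = Nd4j.zeros(1, 8);      // learnable bias row vector
        System.out.println(preOutput(input, w, b));
    }
}
```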