public class SeparableConvolution2DLayer extends ConvolutionLayer
Nested classes/interfaces inherited from interface Layer: Layer.TrainingMode, Layer.Type

Fields inherited from class ConvolutionLayer: convolutionMode, dummyBias, dummyBiasGrad, helper, helperCountFail, i2d, log

Fields inherited from class BaseLayer: gradient, gradientsFlattened, gradientViews, optimizer, params, paramsFlattened, score, solver, weightNoiseParams

Fields inherited from class AbstractLayer: cacheMode, conf, dropoutApplied, dropoutMask, epochCount, index, input, iterationCount, iterationListeners, maskArray, maskState, preOutput
Constructor and Description |
---|
SeparableConvolution2DLayer(NeuralNetConfiguration conf) |
SeparableConvolution2DLayer(NeuralNetConfiguration conf, org.nd4j.linalg.api.ndarray.INDArray input) |
Modifier and Type | Method and Description |
---|---|
org.nd4j.linalg.api.ndarray.INDArray | activate(boolean training) Trigger an activation with the last specified input |
org.nd4j.linalg.primitives.Pair<Gradient,org.nd4j.linalg.api.ndarray.INDArray> | backpropGradient(org.nd4j.linalg.api.ndarray.INDArray epsilon) Calculate the gradient relative to the error in the next layer |
org.nd4j.linalg.api.ndarray.INDArray | preOutput(boolean training) |
protected org.nd4j.linalg.primitives.Pair<org.nd4j.linalg.api.ndarray.INDArray,org.nd4j.linalg.api.ndarray.INDArray> | preOutput(boolean training, boolean forBackprop) PreOutput method that also returns the im2col2d array (if being called for backprop), as this can be re-used instead of being calculated again. |
Methods inherited from class ConvolutionLayer: calcL1, calcL2, fit, hasBias, isPretrainLayer, params, preOutput4d, setParams, transpose, type

Methods inherited from class BaseLayer: accumulateScore, activate, activate, clear, clearNoiseWeightParams, clone, computeGradientAndScore, fit, getGradientsViewArray, getOptimizer, getParam, getParamWithNoise, gradient, initParams, iterate, layerConf, numParams, paramTable, paramTable, preOutput, score, setBackpropGradientsViewArray, setParam, setParams, setParamsViewArray, setParamTable, setScoreWithZ, toString, update, update

Methods inherited from class AbstractLayer: activate, activate, activate, addListeners, applyConstraints, applyDropOutIfNecessary, applyMask, batchSize, conf, feedForwardMaskArray, getIndex, getInput, getInputMiniBatchSize, getListeners, getMaskArray, gradientAndScore, init, input, layerId, migrateInput, numParams, preOutput, preOutput, setCacheMode, setConf, setIndex, setInput, setInputMiniBatchSize, setListeners, setListeners, setMaskArray, validateInput

Methods inherited from class java.lang.Object: equals, finalize, getClass, hashCode, notify, notifyAll, wait, wait, wait

Methods inherited from interface Layer: getEpochCount, getIterationCount, setEpochCount, setIterationCount
public SeparableConvolution2DLayer(NeuralNetConfiguration conf)
public SeparableConvolution2DLayer(NeuralNetConfiguration conf, org.nd4j.linalg.api.ndarray.INDArray input)
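These constructors are normally invoked by the framework itself: in typical use the layer is declared through a SeparableConvolution2D configuration and materialized when the network is initialized. A minimal sketch of that route is shown below; the builder options and hyperparameter values used (kernel size, depthMultiplier, input shape, etc.) are illustrative assumptions, not requirements of this class.

```java
import org.deeplearning4j.nn.conf.MultiLayerConfiguration;
import org.deeplearning4j.nn.conf.NeuralNetConfiguration;
import org.deeplearning4j.nn.conf.inputs.InputType;
import org.deeplearning4j.nn.conf.layers.OutputLayer;
import org.deeplearning4j.nn.conf.layers.SeparableConvolution2D;
import org.deeplearning4j.nn.multilayer.MultiLayerNetwork;
import org.nd4j.linalg.activations.Activation;
import org.nd4j.linalg.lossfunctions.LossFunctions;

public class SeparableConvExample {
    public static void main(String[] args) {
        // Illustrative hyperparameters: 3x3 kernel, depth multiplier 2, MNIST-sized input.
        MultiLayerConfiguration conf = new NeuralNetConfiguration.Builder()
                .list()
                .layer(0, new SeparableConvolution2D.Builder()
                        .kernelSize(3, 3)
                        .stride(1, 1)
                        .nIn(1)                 // input channels
                        .nOut(16)               // output channels
                        .depthMultiplier(2)     // channel multiplier of the depthwise step
                        .activation(Activation.RELU)
                        .build())
                .layer(1, new OutputLayer.Builder(LossFunctions.LossFunction.MCXENT)
                        .nOut(10)
                        .activation(Activation.SOFTMAX)
                        .build())
                .setInputType(InputType.convolutional(28, 28, 1))
                .build();

        MultiLayerNetwork net = new MultiLayerNetwork(conf);
        net.init();  // the configuration above is materialized as layers, including SeparableConvolution2DLayer
    }
}
```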
public org.nd4j.linalg.primitives.Pair<Gradient,org.nd4j.linalg.api.ndarray.INDArray> backpropGradient(org.nd4j.linalg.api.ndarray.INDArray epsilon)
Description copied from interface: Layer

Calculate the gradient relative to the error in the next layer.

Specified by: backpropGradient in interface Layer

Overrides: backpropGradient in class ConvolutionLayer
Parameters:

epsilon - w^(L+1)*delta^(L+1). Or, equivalently: dC/da, i.e., (dC/dz)*(dz/da) = dC/da, where C is the cost function and a = sigma(z) is the activation.
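For orientation, the following is a minimal sketch of how this backward-pass contract could be exercised on a single initialized layer; in a real network MultiLayerNetwork or ComputationGraph computes and routes epsilon automatically. The layer variable, the input shape, and the stand-in epsilon below are illustrative assumptions.

```java
import org.deeplearning4j.nn.gradient.Gradient;
import org.nd4j.linalg.api.ndarray.INDArray;
import org.nd4j.linalg.factory.Nd4j;
import org.nd4j.linalg.primitives.Pair;

// 'layer' is assumed to be an initialized SeparableConvolution2DLayer from a built network,
// e.g. (SeparableConvolution2DLayer) net.getLayer(0).
INDArray input = Nd4j.rand(new int[]{8, 1, 28, 28});   // [miniBatch, channels, height, width]
layer.setInput(input);
INDArray activations = layer.activate(true);            // forward pass in training mode

// epsilon stands in for dC/da arriving from the layer above; same shape as the activations.
INDArray epsilon = Nd4j.onesLike(activations);
Pair<Gradient, INDArray> result = layer.backpropGradient(epsilon);

Gradient paramGradients = result.getFirst();   // gradients w.r.t. this layer's weights/bias
INDArray epsilonBelow = result.getSecond();    // dC/da to pass to the layer below
```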
public org.nd4j.linalg.api.ndarray.INDArray preOutput(boolean training)

Overrides: preOutput in class ConvolutionLayer
protected org.nd4j.linalg.primitives.Pair<org.nd4j.linalg.api.ndarray.INDArray,org.nd4j.linalg.api.ndarray.INDArray> preOutput(boolean training, boolean forBackprop)
Description copied from class: ConvolutionLayer

PreOutput method that also returns the im2col2d array (if being called for backprop), as this can be re-used instead of being calculated again.

Overrides: preOutput in class ConvolutionLayer

Parameters:

training - Train or test time (impacts dropout)

forBackprop - If true: return the im2col2d array for re-use during backprop. If false: return null for the second pair entry. Note that it may still be null in the case of CuDNN and the like.
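Because this overload is protected, it is only reachable from inside the layer hierarchy. The fragment below is a hedged illustration of how the returned pair might be consumed there; the variable names are illustrative.

```java
// Inside the layer hierarchy (the two-argument preOutput is protected):
Pair<INDArray, INDArray> pre = preOutput(true, true);  // training mode, keep im2col2d for backprop
INDArray z = pre.getFirst();            // pre-activation output (before the activation function)
INDArray im2col2d = pre.getSecond();    // may be null, e.g. when a CuDNN helper handled the forward pass
if (im2col2d != null) {
    // re-use im2col2d in the backward pass instead of recomputing it
}
```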
public org.nd4j.linalg.api.ndarray.INDArray activate(boolean training)

Description copied from interface: Layer

Trigger an activation with the last specified input.

Specified by: activate in interface Layer

Overrides: activate in class ConvolutionLayer
Parameters:

training - training or test mode
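As a small illustrative note (re-using the assumed layer variable from the earlier sketch), the flag switches between training- and test-time behaviour, which matters when dropout is configured:

```java
INDArray trainActivations = layer.activate(true);   // training mode: dropout (if configured) is applied
INDArray testActivations  = layer.activate(false);  // test/inference mode: dropout is skipped
```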