public class Convolution3DLayer extends ConvolutionLayer
Nested classes/interfaces inherited from interface Layer:
Layer.TrainingMode, Layer.Type

Fields inherited from class ConvolutionLayer:
convolutionMode, CUDA_CNN_HELPER_CLASS_NAME, dummyBias, dummyBiasGrad, helper, helperCountFail, i2d

Fields inherited from class BaseLayer:
gradient, gradientsFlattened, gradientViews, optimizer, params, paramsFlattened, score, solver, weightNoiseParams

Fields inherited from class AbstractLayer:
cacheMode, conf, dataType, dropoutApplied, epochCount, index, input, inputModificationAllowed, iterationCount, maskArray, maskState, preOutput, trainingListeners

| Constructor and Description |
|---|
| Convolution3DLayer(NeuralNetConfiguration conf, DataType dataType) |
| Modifier and Type | Method and Description |
|---|---|
| Pair<Gradient,INDArray> | backpropGradient(INDArray epsilon, LayerWorkspaceMgr workspaceMgr): Calculate the gradient relative to the error in the next layer |
| protected Pair<INDArray,INDArray> | preOutput(boolean training, boolean forBackprop, LayerWorkspaceMgr workspaceMgr): PreOutput method that also returns the im2col2d array (if being called for backprop), as this can be re-used instead of being calculated again. |
| INDArray | preOutput(boolean training, LayerWorkspaceMgr workspaceMgr) |
Methods inherited from class ConvolutionLayer:
activate, feedForwardMaskArray, fit, getHelper, hasBias, isPretrainLayer, preOutput4d, setParams, type, validateInputDepth, validateInputRank

Methods inherited from class BaseLayer:
calcRegularizationScore, clear, clearNoiseWeightParams, clone, computeGradientAndScore, fit, getGradientsViewArray, getOptimizer, getParam, getParamWithNoise, gradient, hasLayerNorm, layerConf, numParams, params, paramTable, paramTable, preOutputWithPreNorm, score, setBackpropGradientsViewArray, setParam, setParams, setParamsViewArray, setParamTable, setScoreWithZ, toString, update, update

Methods inherited from class AbstractLayer:
activate, addListeners, allowInputModification, applyConstraints, applyDropOutIfNecessary, applyMask, assertInputSet, backpropDropOutIfPresent, batchSize, close, conf, getConfig, getEpochCount, getIndex, getInput, getInputMiniBatchSize, getListeners, getMaskArray, gradientAndScore, init, input, layerId, numParams, setCacheMode, setConf, setEpochCount, setIndex, setInput, setInputMiniBatchSize, setListeners, setListeners, setMaskArray, updaterDivideByMinibatch

Methods inherited from class java.lang.Object:
equals, finalize, getClass, hashCode, notify, notifyAll, wait, wait, wait

Methods inherited from interface Layer:
getIterationCount, setIterationCount

Constructor Detail:

public Convolution3DLayer(NeuralNetConfiguration conf, DataType dataType)
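For illustration only, a minimal sketch of constructing the layer directly. In practice Convolution3DLayer instances are created by the framework when a network built from a Convolution3D layer configuration is initialized; the builder calls and the kernel/stride/nIn/nOut values below are assumptions and may differ across DL4J versions.

```java
import org.deeplearning4j.nn.conf.NeuralNetConfiguration;
import org.deeplearning4j.nn.conf.layers.Convolution3D;
import org.deeplearning4j.nn.layers.convolution.Convolution3DLayer;
import org.nd4j.linalg.api.buffer.DataType;

public class Convolution3DLayerSketch {
    public static void main(String[] args) {
        // Assumed configuration: a single 3D convolution layer conf wrapped in a
        // NeuralNetConfiguration. All hyperparameter values are arbitrary.
        NeuralNetConfiguration conf = new NeuralNetConfiguration.Builder()
                .layer(new Convolution3D.Builder()
                        .kernelSize(2, 2, 2)   // depth, height, width
                        .stride(1, 1, 1)
                        .nIn(1)
                        .nOut(8)
                        .build())
                .build();

        // The documented constructor: associates the configuration and data type.
        Convolution3DLayer layer = new Convolution3DLayer(conf, DataType.FLOAT);
        System.out.println(layer);
    }
}
```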
Method Detail:

public Pair<Gradient,INDArray> backpropGradient(INDArray epsilon, LayerWorkspaceMgr workspaceMgr)
Description copied from interface: Layer
Calculate the gradient relative to the error in the next layer.

Specified by: backpropGradient in interface Layer
Overrides: backpropGradient in class ConvolutionLayer

Parameters:
epsilon - w^(L+1)*delta^(L+1). Or, equivalently, dC/da, i.e., (dC/dz)*(dz/da) = dC/da, where C is the cost function and a = sigma(z) is the activation.
workspaceMgr - Workspace manager

Returns:
Gradient and epsilon for this layer, with the returned epsilon array placed in the ArrayType.ACTIVATION_GRAD workspace via the workspace manager.
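As a rough usage sketch (not how the framework drives backpropagation internally): it assumes an already-initialized layer whose input was set during the forward pass, and uses LayerWorkspaceMgr.noWorkspaces() so no workspace scoping is needed. The epsilon shape and the Pair import location are illustrative assumptions.

```java
import org.deeplearning4j.nn.gradient.Gradient;
import org.deeplearning4j.nn.layers.convolution.Convolution3DLayer;
import org.deeplearning4j.nn.workspace.LayerWorkspaceMgr;
import org.nd4j.common.primitives.Pair;
import org.nd4j.linalg.api.ndarray.INDArray;
import org.nd4j.linalg.factory.Nd4j;

public class BackpropSketch {
    // Assumed: 'layer' is initialized and its input was set by a prior forward pass.
    static void backprop(Convolution3DLayer layer) {
        LayerWorkspaceMgr mgr = LayerWorkspaceMgr.noWorkspaces();

        // epsilon = dC/da for this layer's activations; its shape must match the
        // layer's output activations (arbitrary NCDHW-style example below).
        INDArray epsilon = Nd4j.rand(new int[]{2, 8, 4, 4, 4});

        Pair<Gradient, INDArray> result = layer.backpropGradient(epsilon, mgr);
        Gradient paramGradients = result.getFirst();  // gradients w.r.t. this layer's parameters
        INDArray epsilonBelow = result.getSecond();   // dC/da to pass to the layer below
    }
}
```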
public INDArray preOutput(boolean training, LayerWorkspaceMgr workspaceMgr)

Overrides: preOutput in class BaseLayer<ConvolutionLayer>

protected Pair<INDArray,INDArray> preOutput(boolean training, boolean forBackprop, LayerWorkspaceMgr workspaceMgr)
Description copied from class: ConvolutionLayer
PreOutput method that also returns the im2col2d array (if being called for backprop), as this can be re-used instead of being calculated again.

Overrides: preOutput in class ConvolutionLayer

Parameters:
training - Train or test time (impacts dropout)
forBackprop - If true: return the im2col2d array for re-use during backprop. If false: return null for the second pair entry. Note that it may still be null in the case of CuDNN and the like.
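For the public preOutput overload, a minimal sketch of obtaining pre-activations. The two-argument setInput call, the variable names, and the NCDHW input shape are assumptions for illustration and may differ by version.

```java
import org.deeplearning4j.nn.layers.convolution.Convolution3DLayer;
import org.deeplearning4j.nn.workspace.LayerWorkspaceMgr;
import org.nd4j.linalg.api.ndarray.INDArray;
import org.nd4j.linalg.factory.Nd4j;

public class PreOutputSketch {
    // Assumed: 'layer' is an initialized Convolution3DLayer.
    static INDArray preActivations(Convolution3DLayer layer) {
        LayerWorkspaceMgr mgr = LayerWorkspaceMgr.noWorkspaces();

        // Illustrative NCDHW input: [minibatch, channels, depth, height, width]
        INDArray input = Nd4j.rand(new int[]{2, 1, 8, 8, 8});
        layer.setInput(input, mgr);

        // Pre-activations (values before the activation function is applied);
        // training = false, so dropout is not applied.
        return layer.preOutput(false, mgr);
    }
}
```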