public class Convolution1DLayer extends ConvolutionLayer
Nested classes/interfaces inherited from interface org.deeplearning4j.nn.api.Layer:
Layer.TrainingMode, Layer.Type

Fields inherited from class ConvolutionLayer:
convolutionMode, helper, i2d, log

Fields inherited from class BaseLayer:
gradient, gradientsFlattened, gradientViews, optimizer, params, paramsFlattened, score, solver

Fields inherited from class AbstractLayer:
cacheMode, conf, dropoutApplied, dropoutMask, index, input, iterationListeners, maskArray, maskState, preOutput

| Constructor and Description |
|---|
| Convolution1DLayer(NeuralNetConfiguration conf) |
| Convolution1DLayer(NeuralNetConfiguration conf, org.nd4j.linalg.api.ndarray.INDArray input) |
| Modifier and Type | Method and Description |
|---|---|
| org.nd4j.linalg.primitives.Pair<Gradient,org.nd4j.linalg.api.ndarray.INDArray> | backpropGradient(org.nd4j.linalg.api.ndarray.INDArray epsilon) Calculate the gradient relative to the error in the next layer |
| org.nd4j.linalg.api.ndarray.INDArray | preOutput(boolean training) |
| protected org.nd4j.linalg.primitives.Pair<org.nd4j.linalg.api.ndarray.INDArray,org.nd4j.linalg.api.ndarray.INDArray> | preOutput4d(boolean training, boolean forBackprop) preOutput4d: Used so that ConvolutionLayer subclasses (such as Convolution1DLayer) can maintain their standard non-4d preOutput method, while overriding this to return 4d activations (for use in backprop) without modifying the public API |
Methods inherited from class ConvolutionLayer:
activate, calcGradient, calcL1, calcL2, fit, isPretrainLayer, merge, params, preOutput, setParams, transpose, type

Methods inherited from class BaseLayer:
accumulateScore, activate, activate, activationMean, applyLearningRateScoreDecay, clone, computeGradientAndScore, error, fit, getGradientsViewArray, getOptimizer, getParam, gradient, initParams, iterate, layerConf, numParams, paramTable, paramTable, preOutput, score, setBackpropGradientsViewArray, setParam, setParams, setParamsViewArray, setParamTable, setScoreWithZ, toString, update, update

Methods inherited from class AbstractLayer:
activate, activate, activate, addListeners, applyDropOutIfNecessary, applyMask, batchSize, clear, conf, derivativeActivation, feedForwardMaskArray, getIndex, getInput, getInputMiniBatchSize, getListeners, getMaskArray, gradientAndScore, init, input, layerId, numParams, preOutput, preOutput, setCacheMode, setConf, setIndex, setInput, setInputMiniBatchSize, setListeners, setListeners, setMaskArray, validateInput

Constructor Detail

public Convolution1DLayer(NeuralNetConfiguration conf)
public Convolution1DLayer(NeuralNetConfiguration conf, org.nd4j.linalg.api.ndarray.INDArray input)
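In typical use this implementation class is not constructed by hand; it is created internally when a network configuration containing a 1D convolution layer is initialized. Below is a minimal sketch, assuming the DL4J 0.9.x builder API; the layer sizes, kernel size, and choice of output layer are illustrative, not requirements of this class:

```java
import org.deeplearning4j.nn.conf.MultiLayerConfiguration;
import org.deeplearning4j.nn.conf.NeuralNetConfiguration;
import org.deeplearning4j.nn.conf.inputs.InputType;
import org.deeplearning4j.nn.conf.layers.Convolution1DLayer;
import org.deeplearning4j.nn.conf.layers.RnnOutputLayer;
import org.deeplearning4j.nn.multilayer.MultiLayerNetwork;
import org.nd4j.linalg.activations.Activation;
import org.nd4j.linalg.lossfunctions.LossFunctions;

public class Conv1DConfigSketch {
    public static void main(String[] args) {
        int nIn = 8;   // input channels (features per time step)
        int nOut = 16; // number of 1D filters

        MultiLayerConfiguration conf = new NeuralNetConfiguration.Builder()
                .list()
                // Configuration-side Convolution1DLayer (org.deeplearning4j.nn.conf.layers);
                // the implementation class documented on this page is built from it.
                .layer(0, new Convolution1DLayer.Builder()
                        .kernelSize(3)
                        .stride(1)
                        .nIn(nIn)
                        .nOut(nOut)
                        .activation(Activation.RELU)
                        .build())
                .layer(1, new RnnOutputLayer.Builder(LossFunctions.LossFunction.MCXENT)
                        .activation(Activation.SOFTMAX)
                        .nIn(nOut)
                        .nOut(4)
                        .build())
                .setInputType(InputType.recurrent(nIn))
                .build();

        MultiLayerNetwork net = new MultiLayerNetwork(conf);
        net.init(); // internal layer objects, including this class, are created here
    }
}
```

After net.init() has run, net.getLayer(0) returns the initialized layer instance for the first configured layer.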
public org.nd4j.linalg.primitives.Pair<Gradient,org.nd4j.linalg.api.ndarray.INDArray> backpropGradient(org.nd4j.linalg.api.ndarray.INDArray epsilon)
Calculate the gradient relative to the error in the next layer.

Specified by: backpropGradient in interface Layer
Overrides: backpropGradient in class ConvolutionLayer
Parameters:
epsilon - w^(L+1)*delta^(L+1). Or, equivalently, dC/da, i.e., (dC/dz)*(dz/da) = dC/da, where C is the cost function and a = sigma(z) is the activation.
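To make the epsilon argument concrete: for a simple squared-error cost C = 1/2 * sum((a - y)^2), the derivative dC/da is just (a - y). The sketch below computes that quantity for hypothetical 3d activations shaped like a 1D convolution layer's output; the names, shapes, and cost function are illustrative only, not part of this class's API:

```java
import org.nd4j.linalg.api.ndarray.INDArray;
import org.nd4j.linalg.factory.Nd4j;

import java.util.Arrays;

public class EpsilonSketch {
    public static void main(String[] args) {
        // Hypothetical activations a = sigma(z) and targets y, both shaped
        // [minibatch, channels, sequenceLength] like a 1D conv layer's output.
        INDArray a = Nd4j.rand(new int[] {2, 4, 10});
        INDArray y = Nd4j.rand(new int[] {2, 4, 10});

        // For C = 1/2 * sum((a - y)^2), the derivative dC/da = (a - y).
        // This is the kind of quantity a layer above would pass down as epsilon.
        INDArray epsilon = a.sub(y);

        System.out.println(Arrays.toString(epsilon.shape())); // [2, 4, 10]
    }
}
```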
protected org.nd4j.linalg.primitives.Pair<org.nd4j.linalg.api.ndarray.INDArray,org.nd4j.linalg.api.ndarray.INDArray> preOutput4d(boolean training, boolean forBackprop)
Overrides: preOutput4d in class ConvolutionLayer

public org.nd4j.linalg.api.ndarray.INDArray preOutput(boolean training)
Overrides: preOutput in class ConvolutionLayer
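The 3d-versus-4d distinction behind preOutput4d can be pictured as treating a length-L sequence as an image with a singleton spatial dimension, so the 2D convolution machinery of the parent class can be reused during backprop. The sketch below only illustrates that reshape idea; the exact dimension ordering used internally is an assumption for illustration, not taken from this page:

```java
import org.nd4j.linalg.api.ndarray.INDArray;
import org.nd4j.linalg.factory.Nd4j;

import java.util.Arrays;

public class PreOutput4dSketch {
    public static void main(String[] args) {
        int miniBatch = 2, channels = 4, length = 10;

        // 3d activations, as returned by the public preOutput(boolean training):
        // [miniBatch, channels, length]
        INDArray act3d = Nd4j.rand(new int[] {miniBatch, channels, length});

        // 4d view with a trailing singleton dimension, the kind of shape the
        // parent ConvolutionLayer works with (assumed layout for illustration).
        INDArray act4d = act3d.reshape(miniBatch, channels, length, 1);

        System.out.println(Arrays.toString(act3d.shape())); // [2, 4, 10]
        System.out.println(Arrays.toString(act4d.shape())); // [2, 4, 10, 1]
    }
}
```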