public class Convolution1DLayer extends ConvolutionLayer
Nested classes/interfaces inherited from interface Layer: Layer.TrainingMode, Layer.Type

Fields inherited from class ConvolutionLayer: convolutionMode, helper, i2d, log

Fields inherited from class BaseLayer: gradient, gradientsFlattened, gradientViews, optimizer, params, paramsFlattened, score, solver

Fields inherited from class AbstractLayer: cacheMode, conf, dropoutApplied, dropoutMask, index, input, iterationListeners, maskArray, maskState, preOutput
Constructor and Description |
---|
Convolution1DLayer(NeuralNetConfiguration conf) |
Convolution1DLayer(NeuralNetConfiguration conf, org.nd4j.linalg.api.ndarray.INDArray input) |
Modifier and Type | Method and Description |
---|---|
org.nd4j.linalg.primitives.Pair<Gradient,org.nd4j.linalg.api.ndarray.INDArray> | backpropGradient(org.nd4j.linalg.api.ndarray.INDArray epsilon): Calculate the gradient relative to the error in the next layer |
org.nd4j.linalg.api.ndarray.INDArray | preOutput(boolean training) |
protected org.nd4j.linalg.primitives.Pair<org.nd4j.linalg.api.ndarray.INDArray,org.nd4j.linalg.api.ndarray.INDArray> | preOutput4d(boolean training, boolean forBackprop): Used so that ConvolutionLayer subclasses (such as Convolution1DLayer) can maintain their standard non-4d preOutput method, while overriding this to return 4d activations (for use in backprop) without modifying the public API |
Methods inherited from class ConvolutionLayer: activate, calcGradient, calcL1, calcL2, fit, isPretrainLayer, merge, params, preOutput, setParams, transpose, type

Methods inherited from class BaseLayer: accumulateScore, activate, activate, activationMean, applyLearningRateScoreDecay, clone, computeGradientAndScore, error, fit, getGradientsViewArray, getOptimizer, getParam, gradient, initParams, iterate, layerConf, numParams, paramTable, paramTable, preOutput, score, setBackpropGradientsViewArray, setParam, setParams, setParamsViewArray, setParamTable, setScoreWithZ, toString, update, update

Methods inherited from class AbstractLayer: activate, activate, activate, addListeners, applyDropOutIfNecessary, applyMask, batchSize, clear, conf, derivativeActivation, feedForwardMaskArray, getIndex, getInput, getInputMiniBatchSize, getListeners, getMaskArray, gradientAndScore, init, input, layerId, numParams, preOutput, preOutput, setCacheMode, setConf, setIndex, setInput, setInputMiniBatchSize, setListeners, setListeners, setMaskArray, validateInput
public Convolution1DLayer(NeuralNetConfiguration conf)
public Convolution1DLayer(NeuralNetConfiguration conf, org.nd4j.linalg.api.ndarray.INDArray input)
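In practice this implementation class is rarely constructed directly. A network is normally described with the corresponding configuration class (org.deeplearning4j.nn.conf.layers.Convolution1DLayer), and the implementation layer documented here is created internally when the network is initialized. A minimal configuration sketch, assuming the DL4J 0.9.x builder API; the builder options, layer sizes, and the InputType.recurrent(...) call below are illustrative assumptions rather than part of this class's contract:

```java
import org.deeplearning4j.nn.conf.MultiLayerConfiguration;
import org.deeplearning4j.nn.conf.NeuralNetConfiguration;
import org.deeplearning4j.nn.conf.inputs.InputType;
import org.deeplearning4j.nn.conf.layers.Convolution1DLayer;
import org.deeplearning4j.nn.conf.layers.RnnOutputLayer;
import org.deeplearning4j.nn.multilayer.MultiLayerNetwork;
import org.deeplearning4j.nn.weights.WeightInit;
import org.nd4j.linalg.activations.Activation;
import org.nd4j.linalg.lossfunctions.LossFunctions;

public class Conv1dConfigSketch {
    public static void main(String[] args) {
        int nIn = 8;      // input channels (features per time step) - assumed values
        int nOut = 16;    // output channels
        int kernel = 3;   // 1D kernel size along the sequence dimension

        MultiLayerConfiguration conf = new NeuralNetConfiguration.Builder()
                .weightInit(WeightInit.XAVIER)
                .list()
                .layer(0, new Convolution1DLayer.Builder(kernel)
                        .nIn(nIn)
                        .nOut(nOut)
                        .activation(Activation.RELU)
                        .build())
                .layer(1, new RnnOutputLayer.Builder(LossFunctions.LossFunction.MCXENT)
                        .nIn(nOut)
                        .nOut(4)
                        .activation(Activation.SOFTMAX)
                        .build())
                .setInputType(InputType.recurrent(nIn))   // rank-3 input: [miniBatch, channels, length]
                .build();

        MultiLayerNetwork net = new MultiLayerNetwork(conf);
        net.init();
        // The org.deeplearning4j.nn.layers.convolution.Convolution1DLayer documented on this page
        // is the implementation instantiated for layer 0 during net.init().
    }
}
```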
public org.nd4j.linalg.primitives.Pair<Gradient,org.nd4j.linalg.api.ndarray.INDArray> backpropGradient(org.nd4j.linalg.api.ndarray.INDArray epsilon)
Description copied from interface: Layer
Calculate the gradient relative to the error in the next layer.

Specified by:
backpropGradient in interface Layer

Overrides:
backpropGradient in class ConvolutionLayer

Parameters:
epsilon - w^(L+1)*delta^(L+1). Or, equivalently, dC/da, i.e., (dC/dz)*(dz/da) = dC/da, where C is the cost function and a = sigma(z) is the activation.
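A minimal sketch of calling backpropGradient directly, assuming layer is an already-initialized Convolution1DLayer whose parameters and backprop gradient views have been set up by the enclosing network, and assuming rank-3 activations of shape [miniBatchSize, channels, sequenceLength]; the shapes, values, and wrapper method are illustrative only:

```java
import org.deeplearning4j.nn.api.Layer;
import org.deeplearning4j.nn.gradient.Gradient;
import org.nd4j.linalg.api.ndarray.INDArray;
import org.nd4j.linalg.factory.Nd4j;
import org.nd4j.linalg.primitives.Pair;

public class Conv1dBackpropSketch {
    // "layer" is assumed to be a fully initialized Convolution1DLayer (params and gradient views set).
    static void backpropOnce(Layer layer) {
        INDArray input = Nd4j.rand(new int[]{16, 8, 100});   // assumed shape: [miniBatch, channels, length]
        layer.setInput(input);
        INDArray activations = layer.activate(true);          // forward pass in training mode

        // epsilon is dC/da for this layer's output, with the same shape as the activations
        INDArray epsilon = Nd4j.rand(activations.shape());
        Pair<Gradient, INDArray> result = layer.backpropGradient(epsilon);

        Gradient layerGradient = result.getFirst();   // gradients for this layer's weights and biases
        INDArray epsilonBelow = result.getSecond();   // error signal passed on to the layer below
    }
}
```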
protected org.nd4j.linalg.primitives.Pair<org.nd4j.linalg.api.ndarray.INDArray,org.nd4j.linalg.api.ndarray.INDArray> preOutput4d(boolean training, boolean forBackprop)
Description copied from class: ConvolutionLayer

Overrides:
preOutput4d in class ConvolutionLayer
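Conceptually, preOutput4d is the hook that lets this 1D layer reuse ConvolutionLayer's 2D machinery: its rank-3 activations are presented as rank-4 arrays for backprop while the public preOutput/activate API stays rank-3. The sketch below only illustrates that reshaping idea with plain ND4J calls; the actual 4d layout used internally is an implementation detail, and the trailing dimension of size 1 here is an assumption:

```java
import org.nd4j.linalg.api.ndarray.INDArray;
import org.nd4j.linalg.factory.Nd4j;

public class Conv1dReshapeSketch {
    public static void main(String[] args) {
        INDArray act3d = Nd4j.rand(new int[]{16, 8, 100});   // [miniBatch, channels, length]
        INDArray act4d = act3d.reshape(16, 8, 100, 1);        // viewed as an "image" with one unit-size spatial dim
        // ... standard 4d convolution forward/backward logic could operate on act4d here ...
        INDArray back3d = act4d.reshape(16, 8, 100);          // back to the rank-3 shape of the public API
        System.out.println(java.util.Arrays.toString(back3d.shape()));
    }
}
```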
public org.nd4j.linalg.api.ndarray.INDArray preOutput(boolean training)
Overrides:
preOutput in class ConvolutionLayer
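preOutput returns the layer's pre-activations (the convolution output plus bias, before the activation function is applied), while activate applies the configured activation function to them. A short fragment continuing the backpropGradient sketch above, with the same initialized layer and input (same assumptions apply):

```java
// Continuing the backpropGradient sketch: "layer" is initialized and its input has been set.
INDArray z = layer.preOutput(true);   // pre-activations, training mode
INDArray a = layer.activate(true);    // a = activationFn(z)
```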