public class RnnOutputLayer extends BaseOutputLayer<RnnOutputLayer>
See Also: BaseOutputLayer, OutputLayer, Serialized Form

Nested classes/interfaces inherited from interface Layer: Layer.TrainingMode, Layer.Type

Fields inherited from class BaseOutputLayer: labels

Fields inherited from class BaseLayer: conf, dropoutApplied, dropoutMask, gradient, gradientsFlattened, gradientViews, index, input, iterationListeners, maskArray, optimizer, params, paramsFlattened, score

| Constructor and Description |
|---|
| RnnOutputLayer(NeuralNetConfiguration conf) |
| RnnOutputLayer(NeuralNetConfiguration conf, org.nd4j.linalg.api.ndarray.INDArray input) |
| Modifier and Type | Method and Description |
|---|---|
| org.nd4j.linalg.api.ndarray.INDArray | activate(boolean training): Trigger an activation with the last specified input |
| Pair<Gradient,org.nd4j.linalg.api.ndarray.INDArray> | backpropGradient(org.nd4j.linalg.api.ndarray.INDArray epsilon): Calculate the gradient relative to the error in the next layer |
| double | f1Score(org.nd4j.linalg.api.ndarray.INDArray examples, org.nd4j.linalg.api.ndarray.INDArray labels): Returns the F1 score for the given examples |
| org.nd4j.linalg.api.ndarray.INDArray | getInput() |
| protected org.nd4j.linalg.api.ndarray.INDArray | getLabels2d() |
| org.nd4j.linalg.api.ndarray.INDArray | output(boolean training): Classify input |
| org.nd4j.linalg.api.ndarray.INDArray | output(org.nd4j.linalg.api.ndarray.INDArray input) |
| org.nd4j.linalg.api.ndarray.INDArray | preOutput(org.nd4j.linalg.api.ndarray.INDArray x, boolean training): Raw activations |
| protected org.nd4j.linalg.api.ndarray.INDArray | preOutput2d(boolean training) |
| void | setMaskArray(org.nd4j.linalg.api.ndarray.INDArray maskArray) |
| Layer.Type | type(): Returns the layer type |
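Several of the methods above (preOutput2d, getLabels2d) exist because a recurrent output layer receives 3d time-series activations but scores them with ordinary 2d output-layer math. A minimal plain-Java sketch of that flattening follows; the shape convention [miniBatchSize][layerSize][timeSeriesLength] and the example-major row ordering are assumptions for illustration, and the helper names are hypothetical, not DL4J API:

```java
import java.util.Arrays;

/**
 * Sketch: flatten a 3d time-series activation array of shape
 * [miniBatch][layerSize][timeSteps] into a 2d array of shape
 * [miniBatch * timeSteps][layerSize], so that each 2d row is
 * one time step of one example. Hypothetical helper, not DL4J API.
 */
class TimeSeriesReshape {
    static double[][] to2d(double[][][] in) {
        int miniBatch = in.length;
        int layerSize = in[0].length;
        int timeSteps = in[0][0].length;
        double[][] out = new double[miniBatch * timeSteps][layerSize];
        for (int ex = 0; ex < miniBatch; ex++) {
            for (int t = 0; t < timeSteps; t++) {
                for (int f = 0; f < layerSize; f++) {
                    // row index: example-major, then time step (an assumed ordering)
                    out[ex * timeSteps + t][f] = in[ex][f][t];
                }
            }
        }
        return out;
    }

    public static void main(String[] args) {
        // 1 example, 2 features, 3 time steps
        double[][][] x = {{{1, 2, 3}, {4, 5, 6}}};
        double[][] flat = to2d(x);
        // rows are time steps: [1,4], [2,5], [3,6]
        System.out.println(Arrays.deepToString(flat));
    }
}
```

The reverse reshape is applied on the way back, so backpropGradient can hand a 3d epsilon to the recurrent layer below.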
Methods inherited from class BaseOutputLayer:
activate, activate, activate, clear, computeGradientAndScore, computeScore, computeScoreForExamples, f1Score, fit, fit, fit, fit, fit, getLabels, gradient, gradientAndScore, iterate, labelProbabilities, numLabels, output, predict, predict, setLabels, setScoreWithZ

Methods inherited from class BaseLayer:
accumulateScore, activate, activate, activationMean, applyDropOutIfNecessary, applyLearningRateScoreDecay, batchSize, calcGradient, calcL1, calcL2, clone, conf, createGradient, derivativeActivation, error, fit, getIndex, getInputMiniBatchSize, getListeners, getOptimizer, getParam, initParams, input, layerConf, merge, numParams, numParams, params, paramTable, preOutput, preOutput, preOutput, score, setBackpropGradientsViewArray, setConf, setIndex, setInput, setInputMiniBatchSize, setListeners, setListeners, setParam, setParams, setParams, setParamsViewArray, setParamTable, toString, transpose, update, update, validateInput

Methods inherited from class java.lang.Object:
equals, finalize, getClass, hashCode, notify, notifyAll, wait, wait, wait

Methods inherited from interface Layer:
activate, activate, activationMean, calcGradient, calcL1, calcL2, clone, derivativeActivation, error, getIndex, getInputMiniBatchSize, getListeners, merge, preOutput, preOutput, setIndex, setInput, setInputMiniBatchSize, setListeners, setListeners, transpose

Methods inherited from interface Model:
accumulateScore, applyLearningRateScoreDecay, batchSize, conf, fit, getOptimizer, getParam, initParams, input, numParams, numParams, params, paramTable, score, setBackpropGradientsViewArray, setConf, setParam, setParams, setParamsViewArray, setParamTable, update, update, validateInput

public RnnOutputLayer(NeuralNetConfiguration conf)
public RnnOutputLayer(NeuralNetConfiguration conf, org.nd4j.linalg.api.ndarray.INDArray input)
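In the method details below, output(...)/activate(...) return a row of label likelihoods per example, and backpropGradient(...) consumes epsilon = dC/da. Under the common softmax-activation-with-cross-entropy-loss pairing (an assumption; this page does not fix the loss function), the per-row forward pass and the well-known simplification dC/dz = a - y can be sketched in plain Java, with arrays standing in for INDArrays:

```java
/**
 * Sketch: softmax forward pass and its pre-activation gradient
 * under cross-entropy loss (assumed pairing, not stated on this page).
 * Plain double[] rows stand in for INDArray rows.
 */
class SoftmaxSketch {
    // One row of raw activations ("preOutput") -> label likelihoods summing to 1.
    static double[] softmax(double[] z) {
        double max = Double.NEGATIVE_INFINITY;
        for (double v : z) max = Math.max(max, v); // shift for numerical stability
        double sum = 0;
        double[] a = new double[z.length];
        for (int i = 0; i < z.length; i++) { a[i] = Math.exp(z[i] - max); sum += a[i]; }
        for (int i = 0; i < a.length; i++) a[i] /= sum;
        return a;
    }

    // With softmax + cross-entropy, dC/dz simplifies to (a - y).
    static double[] gradient(double[] a, double[] y) {
        double[] g = new double[a.length];
        for (int i = 0; i < a.length; i++) g[i] = a[i] - y[i];
        return g;
    }

    public static void main(String[] args) {
        double[] z = {2.0, 1.0, 0.1};        // raw activations for one row
        double[] a = softmax(z);             // likelihood per label
        double[] y = {1.0, 0.0, 0.0};        // one-hot true label
        double[] g = gradient(a, y);
        double sum = 0;
        for (double v : a) sum += v;
        // gradient is negative for the true label (pushes its score up)
        System.out.printf("sum=%.3f grad0=%s%n", sum, g[0] < 0);
    }
}
```

In the time-series case, each flattened 2d row (one time step of one example) gets this treatment independently.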
public Pair<Gradient,org.nd4j.linalg.api.ndarray.INDArray> backpropGradient(org.nd4j.linalg.api.ndarray.INDArray epsilon)
Specified by: backpropGradient in interface Layer
Overrides: backpropGradient in class BaseOutputLayer<RnnOutputLayer>
Parameters: epsilon - w^(L+1)*delta^(L+1). Or, equivalently, dC/da, i.e. (dC/dz)*(dz/da) = dC/da, where C is the cost function and a = sigma(z) is the activation.

public double f1Score(org.nd4j.linalg.api.ndarray.INDArray examples, org.nd4j.linalg.api.ndarray.INDArray labels)
Returns the F1 score for the given examples.
Specified by: f1Score in interface Classifier
Overrides: f1Score in class BaseOutputLayer<RnnOutputLayer>
Parameters: examples - the examples to classify (one example in each row)
labels - the true labels

public org.nd4j.linalg.api.ndarray.INDArray getInput()
Overrides: getInput in class BaseLayer<RnnOutputLayer>

public Layer.Type type()
Returns the layer type.
Specified by: type in interface Layer
Overrides: type in class BaseLayer<RnnOutputLayer>

public org.nd4j.linalg.api.ndarray.INDArray preOutput(org.nd4j.linalg.api.ndarray.INDArray x, boolean training)
Raw activations
Specified by: preOutput in interface Layer
Overrides: preOutput in class BaseLayer<RnnOutputLayer>
Parameters: x - the input to transform

protected org.nd4j.linalg.api.ndarray.INDArray preOutput2d(boolean training)
Overrides: preOutput2d in class BaseOutputLayer<RnnOutputLayer>

protected org.nd4j.linalg.api.ndarray.INDArray getLabels2d()
Overrides: getLabels2d in class BaseOutputLayer<RnnOutputLayer>

public org.nd4j.linalg.api.ndarray.INDArray output(org.nd4j.linalg.api.ndarray.INDArray input)
Overrides: output in class BaseOutputLayer<RnnOutputLayer>

public org.nd4j.linalg.api.ndarray.INDArray output(boolean training)
Description copied from class: BaseOutputLayer
Classify the input (which can be either a matrix or a vector). If it is a matrix, each row is considered an example and the associated rows are classified accordingly; each row of the result will be the likelihood of a label given that example.
Overrides: output in class BaseOutputLayer<RnnOutputLayer>
Parameters: training - determines whether this is run in training mode

public org.nd4j.linalg.api.ndarray.INDArray activate(boolean training)
Specified by: activate in interface Layer
Overrides: activate in class BaseLayer<RnnOutputLayer>
Parameters: training - training or test mode

public void setMaskArray(org.nd4j.linalg.api.ndarray.INDArray maskArray)
Specified by: setMaskArray in interface Layer
Overrides: setMaskArray in class BaseLayer<RnnOutputLayer>

Copyright © 2016. All Rights Reserved.