public class RnnOutputLayer extends BaseOutputLayer<RnnOutputLayer>

See Also: BaseOutputLayer, OutputLayer, Serialized Form

Nested classes/interfaces inherited from interface Layer:
Layer.TrainingMode, Layer.Type

Fields inherited from class BaseOutputLayer:
labels

Fields inherited from class BaseLayer:
conf, dropoutApplied, dropoutMask, gradient, gradientsFlattened, gradientViews, index, input, iterationListeners, maskArray, optimizer, params, paramsFlattened, score
| Constructor and Description |
|---|
| RnnOutputLayer(NeuralNetConfiguration conf) |
| RnnOutputLayer(NeuralNetConfiguration conf, org.nd4j.linalg.api.ndarray.INDArray input) |
| Modifier and Type | Method and Description |
|---|---|
| org.nd4j.linalg.api.ndarray.INDArray | activate(boolean training): Trigger an activation with the last specified input |
| Pair<Gradient,org.nd4j.linalg.api.ndarray.INDArray> | backpropGradient(org.nd4j.linalg.api.ndarray.INDArray epsilon): Calculate the gradient relative to the error in the next layer |
| double | f1Score(org.nd4j.linalg.api.ndarray.INDArray examples, org.nd4j.linalg.api.ndarray.INDArray labels): Returns the f1 score for the given examples. |
| org.nd4j.linalg.api.ndarray.INDArray | getInput() |
| protected org.nd4j.linalg.api.ndarray.INDArray | getLabels2d() |
| org.nd4j.linalg.api.ndarray.INDArray | output(boolean training): Classify input |
| org.nd4j.linalg.api.ndarray.INDArray | output(org.nd4j.linalg.api.ndarray.INDArray input) |
| org.nd4j.linalg.api.ndarray.INDArray | preOutput(org.nd4j.linalg.api.ndarray.INDArray x, boolean training): Raw activations |
| protected org.nd4j.linalg.api.ndarray.INDArray | preOutput2d(boolean training) |
| void | setMaskArray(org.nd4j.linalg.api.ndarray.INDArray maskArray) |
| Layer.Type | type(): Returns the layer type |
Methods inherited from class BaseOutputLayer:
activate, activate, activate, clear, computeGradientAndScore, computeScore, computeScoreForExamples, f1Score, fit, fit, fit, fit, fit, getLabels, gradient, gradientAndScore, iterate, labelProbabilities, numLabels, output, predict, predict, setLabels, setScoreWithZ

Methods inherited from class BaseLayer:
accumulateScore, activate, activate, activationMean, applyDropOutIfNecessary, applyLearningRateScoreDecay, batchSize, calcGradient, calcL1, calcL2, clone, conf, createGradient, derivativeActivation, error, fit, getIndex, getInputMiniBatchSize, getListeners, getOptimizer, getParam, initParams, input, layerConf, merge, numParams, numParams, params, paramTable, preOutput, preOutput, preOutput, score, setBackpropGradientsViewArray, setConf, setIndex, setInput, setInputMiniBatchSize, setListeners, setListeners, setParam, setParams, setParams, setParamsViewArray, setParamTable, toString, transpose, update, update, validateInput

Methods inherited from class java.lang.Object:
equals, finalize, getClass, hashCode, notify, notifyAll, wait, wait, wait

Methods inherited from interface Layer:
activate, activate, activationMean, calcGradient, calcL1, calcL2, clone, derivativeActivation, error, getIndex, getInputMiniBatchSize, getListeners, merge, preOutput, preOutput, setIndex, setInput, setInputMiniBatchSize, setListeners, setListeners, transpose

Methods inherited from interface Model:
accumulateScore, applyLearningRateScoreDecay, batchSize, conf, fit, getOptimizer, getParam, initParams, input, numParams, numParams, params, paramTable, score, setBackpropGradientsViewArray, setConf, setParam, setParams, setParamsViewArray, setParamTable, update, update, validateInput
public RnnOutputLayer(NeuralNetConfiguration conf)
public RnnOutputLayer(NeuralNetConfiguration conf, org.nd4j.linalg.api.ndarray.INDArray input)
public Pair<Gradient,org.nd4j.linalg.api.ndarray.INDArray> backpropGradient(org.nd4j.linalg.api.ndarray.INDArray epsilon)
Calculate the gradient relative to the error in the next layer.
Specified by: backpropGradient in interface Layer
Overrides: backpropGradient in class BaseOutputLayer<RnnOutputLayer>
Parameters:
epsilon - the error signal from the layer above, w^(L+1) * delta^(L+1); equivalently dC/da, where C is the cost function and a = sigma(z) is this layer's activation

public double f1Score(org.nd4j.linalg.api.ndarray.INDArray examples, org.nd4j.linalg.api.ndarray.INDArray labels)
Returns the f1 score for the given examples.
Specified by: f1Score in interface Classifier
Overrides: f1Score in class BaseOutputLayer<RnnOutputLayer>
Parameters:
examples - the examples to classify (one example in each row)
labels - the true labels

public org.nd4j.linalg.api.ndarray.INDArray getInput()
Overrides: getInput in class BaseLayer<RnnOutputLayer>
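f1Score above reports the F1 measure over the supplied examples and labels. As a hedged, self-contained illustration (plain Java, not DL4J code), the standard definition of F1 from true-positive, false-positive, and false-negative counts is:

```java
// Hypothetical standalone sketch of the F1 statistic that f1Score reports,
// assuming the standard definition F1 = 2 * precision * recall / (precision + recall).
public class F1Sketch {
    // tp, fp, fn: true-positive, false-positive, and false-negative counts
    static double f1(int tp, int fp, int fn) {
        double precision = tp / (double) (tp + fp);
        double recall = tp / (double) (tp + fn);
        if (precision + recall == 0.0) return 0.0;
        return 2.0 * precision * recall / (precision + recall);
    }

    public static void main(String[] args) {
        // 8 true positives, 2 false positives, 4 false negatives:
        // precision = 0.8, recall = 2/3, F1 = 8/11
        System.out.println(F1Sketch.f1(8, 2, 4));
    }
}
```

The real method derives these counts from the network's predictions versus the supplied label matrix; the sketch only shows the final statistic.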
public Layer.Type type()
Returns the layer type.
Specified by: type in interface Layer
Overrides: type in class BaseLayer<RnnOutputLayer>
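The epsilon argument to backpropGradient above is the error signal dC/da flowing back from the layer above. A minimal plain-Java sketch of that quantity, assuming a squared-error cost C = 0.5 * sum((a - y)^2) so that dC/da = a - y (the real layer computes this with INDArray operations and the configured loss function):

```java
// Illustration only (not DL4J code): epsilon = dC/da for an assumed
// squared-error cost C = 0.5 * sum((a - y)^2), which gives dC/da = a - y.
public class EpsilonSketch {
    static double[] dCda(double[] a, double[] y) {
        double[] eps = new double[a.length];
        for (int i = 0; i < a.length; i++) {
            eps[i] = a[i] - y[i]; // per-component error signal
        }
        return eps;
    }

    public static void main(String[] args) {
        // activation a = [0.9, 0.1], target y = [1, 0] -> epsilon ~ [-0.1, 0.1]
        double[] eps = dCda(new double[]{0.9, 0.1}, new double[]{1.0, 0.0});
        System.out.println(eps[0] + " " + eps[1]);
    }
}
```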
public org.nd4j.linalg.api.ndarray.INDArray preOutput(org.nd4j.linalg.api.ndarray.INDArray x, boolean training)
Raw activations.
Specified by: preOutput in interface Layer
Overrides: preOutput in class BaseLayer<RnnOutputLayer>
Parameters:
x - the input to transform

protected org.nd4j.linalg.api.ndarray.INDArray preOutput2d(boolean training)
Overrides: preOutput2d in class BaseOutputLayer<RnnOutputLayer>
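preOutput above returns the raw activations, i.e. the values before the nonlinearity is applied. A plain-Java sketch of what that means for a dense transform, assuming the usual z = W*x + b (the real implementation operates on INDArrays):

```java
// Sketch of "raw activations": z = W * x + b, before any activation function.
// This is an assumption-level illustration, not the DL4J implementation.
public class PreOutputSketch {
    static double[] preOutput(double[][] W, double[] x, double[] b) {
        double[] z = new double[W.length];
        for (int i = 0; i < W.length; i++) {
            z[i] = b[i];
            for (int j = 0; j < x.length; j++) {
                z[i] += W[i][j] * x[j]; // accumulate the weighted inputs
            }
        }
        return z;
    }

    public static void main(String[] args) {
        double[][] W = {{1, 0}, {0, 2}};
        double[] z = preOutput(W, new double[]{3, 4}, new double[]{0.5, -0.5});
        System.out.println(z[0] + " " + z[1]); // 3.5 7.5
    }
}
```

The activation function (see activate below) is then applied to z to produce the layer's output.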
protected org.nd4j.linalg.api.ndarray.INDArray getLabels2d()
Overrides: getLabels2d in class BaseOutputLayer<RnnOutputLayer>
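getLabels2d and preOutput2d above exist because recurrent layers carry 3d activations of shape [miniBatchSize, size, timeSeriesLength], while the output-layer loss is computed on 2d data. A hedged plain-Java sketch of one such flattening, with one output row per (example, time step) pair; the exact row ordering in the actual implementation may differ:

```java
// Sketch (not DL4J code) of flattening a 3d time series
// [miniBatch][size][timeSteps] into 2d [miniBatch * timeSteps][size],
// so that ordinary 2d loss computation can be reused per time step.
public class Reshape3dTo2dSketch {
    static double[][] reshape3dTo2d(double[][][] in) {
        int miniBatch = in.length;
        int size = in[0].length;
        int timeSteps = in[0][0].length;
        double[][] out = new double[miniBatch * timeSteps][size];
        for (int m = 0; m < miniBatch; m++)
            for (int t = 0; t < timeSteps; t++)
                for (int s = 0; s < size; s++)
                    out[m * timeSteps + t][s] = in[m][s][t]; // one row per (example, step)
        return out;
    }

    public static void main(String[] args) {
        // 1 example, 2 outputs per step, 3 time steps -> 3 rows of 2 values
        double[][][] in = {{{1, 2, 3}, {4, 5, 6}}};
        double[][] out = reshape3dTo2d(in);
        System.out.println(out.length + "x" + out[0].length); // 3x2
    }
}
```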
public org.nd4j.linalg.api.ndarray.INDArray output(org.nd4j.linalg.api.ndarray.INDArray input)
Overrides: output in class BaseOutputLayer<RnnOutputLayer>
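output above classifies the input, typically yielding one probability distribution over labels per example row. As an illustration (assuming a softmax output activation, which is common for classification but configuration-dependent), a numerically stable softmax over one row in plain Java:

```java
// Sketch (not DL4J code) of a row softmax: converts raw scores z into a
// probability distribution, subtracting the max for numerical stability.
public class SoftmaxSketch {
    static double[] softmax(double[] z) {
        double max = Double.NEGATIVE_INFINITY;
        for (double v : z) max = Math.max(max, v);
        double[] p = new double[z.length];
        double sum = 0.0;
        for (int i = 0; i < z.length; i++) {
            p[i] = Math.exp(z[i] - max); // shift by max to avoid overflow
            sum += p[i];
        }
        for (int i = 0; i < z.length; i++) p[i] /= sum; // normalize to sum to 1
        return p;
    }

    public static void main(String[] args) {
        double[] p = softmax(new double[]{2.0, 1.0, 0.1});
        System.out.println(p[0] + " " + p[1] + " " + p[2]);
    }
}
```

Applied row by row to a matrix of raw activations, this matches the description below: each output row is the likelihood of each label for that example.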
public org.nd4j.linalg.api.ndarray.INDArray output(boolean training)
Classify input.
Overrides: output in class BaseOutputLayer<RnnOutputLayer>
Parameters:
training - whether the network is in training mode
Returns:
the predictions for the input (which can be either a matrix or a vector); if the input is a matrix, each row is treated as an example and classified accordingly, and each output row holds the likelihood of each label for that example

public org.nd4j.linalg.api.ndarray.INDArray activate(boolean training)
Trigger an activation with the last specified input.
Specified by: activate in interface Layer
Overrides: activate in class BaseLayer<RnnOutputLayer>
Parameters:
training - training or test mode

public void setMaskArray(org.nd4j.linalg.api.ndarray.INDArray maskArray)
Specified by: setMaskArray in interface Layer
Overrides: setMaskArray in class BaseLayer<RnnOutputLayer>
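setMaskArray above supplies a mask used with variable-length time series, where padded time steps should not contribute to the score or gradients. A plain-Java sketch of the idea, multiplying per-step values by a 0/1 mask (the real layer applies the mask to INDArrays internally):

```java
// Sketch (not DL4J code) of time-step masking: entries where mask == 0
// (padded steps of a short sequence) are zeroed out and so contribute
// nothing to the score or the backpropagated error.
public class MaskSketch {
    static double[] applyMask(double[] perStepValues, double[] mask) {
        double[] out = new double[perStepValues.length];
        for (int i = 0; i < out.length; i++) {
            out[i] = perStepValues[i] * mask[i];
        }
        return out;
    }

    public static void main(String[] args) {
        // sequence of length 2 padded to 3: the last step is masked out
        double[] masked = applyMask(new double[]{0.3, 0.7, 0.5}, new double[]{1, 1, 0});
        System.out.println(masked[0] + " " + masked[1] + " " + masked[2]); // 0.3 0.7 0.0
    }
}
```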
Copyright © 2016. All Rights Reserved.