Class SameDiffOutputLayer
- java.lang.Object
-
- org.deeplearning4j.nn.layers.AbstractLayer<SameDiffOutputLayer>
-
- org.deeplearning4j.nn.layers.samediff.SameDiffOutputLayer
-
- All Implemented Interfaces:
Serializable, Cloneable, Classifier, Layer, IOutputLayer, Model, Trainable
public class SameDiffOutputLayer extends AbstractLayer<SameDiffOutputLayer> implements IOutputLayer
- See Also:
- Serialized Form
-
-
Nested Class Summary
-
Nested classes/interfaces inherited from interface org.deeplearning4j.nn.api.Layer
Layer.TrainingMode, Layer.Type
-
-
Field Summary
- protected INDArray gradients
- protected Map<String,INDArray> gradTable
- static String INPUT_KEY
- protected INDArray labels
- static String LABELS_KEY
- protected String outputKey
- protected SDVariable outputVar
- protected INDArray params
- protected Map<String,INDArray> paramTable
- protected SameDiff sameDiff
-
Fields inherited from class org.deeplearning4j.nn.layers.AbstractLayer
cacheMode, conf, dataType, dropoutApplied, epochCount, index, input, inputModificationAllowed, iterationCount, maskArray, maskState, preOutput, trainingListeners
-
-
Constructor Summary
- SameDiffOutputLayer(NeuralNetConfiguration conf, DataType dataType)
-
Method Summary
All Methods | Instance Methods | Concrete Methods
- INDArray activate(boolean training, LayerWorkspaceMgr workspaceMgr)
Perform forward pass and return the activations array with the last set input
- Pair<Gradient,INDArray> backpropGradient(INDArray epsilon, LayerWorkspaceMgr workspaceMgr)
Calculate the gradient relative to the error in the next layer
- void clearNoiseWeightParams()
- Layer clone()
- double computeScore(double fullNetRegTerm, boolean training, LayerWorkspaceMgr workspaceMgr)
Compute score after labels and input have been set.
- INDArray computeScoreForExamples(double fullNetRegTerm, LayerWorkspaceMgr workspaceMgr)
Compute the score for each example individually, after labels and input have been set.
- protected void doInit()
- double f1Score(INDArray examples, INDArray labels)
Returns the f1 score for the given examples.
- double f1Score(DataSet data)
Sets the input and labels and returns a score for the prediction with respect to the true labels
- void fit(INDArray examples, int[] labels)
Fit the model
- void fit(INDArray examples, INDArray labels)
Fit the model
- void fit(DataSet data)
Fit the model
- void fit(DataSetIterator iter)
Train the model based on the dataset iterator
- INDArray getGradientsViewArray()
- INDArray getParam(String param)
Get the parameter
- boolean isPretrainLayer()
Returns true if the layer can be trained in an unsupervised/pretrain manner (AE, VAE, etc.)
- boolean needsLabels()
Returns true if labels are required for this output layer
- int numLabels()
Returns the number of possible labels
- long numParams()
The number of parameters for the model
- INDArray params()
Returns the parameters of the neural network as a flattened row vector
- Map<String,INDArray> paramTable()
The param table
- Map<String,INDArray> paramTable(boolean backpropParamsOnly)
Table of parameters by key, for backprop. For many models (dense layers, etc.) all parameters are backprop parameters
- int[] predict(INDArray examples)
Takes in a list of examples. For each row, returns a label
- List<String> predict(DataSet dataSet)
Takes in a DataSet of examples. For each row, returns a label
- void setBackpropGradientsViewArray(INDArray gradients)
Set the gradients array as a view of the full (backprop) network parameters. NOTE: this is intended to be used internally in MultiLayerNetwork and ComputationGraph, not by users.
- void setParam(String key, INDArray val)
Set the parameter with a new ndarray
- void setParams(INDArray params)
Set the parameters for this model.
- protected void setParams(INDArray params, char order)
- void setParamsViewArray(INDArray params)
Set the initial parameters array as a view of the full (backprop) network parameters. NOTE: this is intended to be used internally in MultiLayerNetwork and ComputationGraph, not by users.
- void setParamTable(Map<String,INDArray> paramTable)
Setter for the param table
Methods inherited from class org.deeplearning4j.nn.layers.AbstractLayer
activate, addListeners, allowInputModification, applyConstraints, applyDropOutIfNecessary, applyMask, assertInputSet, backpropDropOutIfPresent, batchSize, calcRegularizationScore, clear, close, computeGradientAndScore, conf, feedForwardMaskArray, fit, fit, getConfig, getEpochCount, getHelper, getIndex, getInput, getInputMiniBatchSize, getListeners, getMaskArray, getOptimizer, gradient, gradientAndScore, init, input, layerConf, layerId, numParams, score, setCacheMode, setConf, setEpochCount, setIndex, setInput, setInputMiniBatchSize, setListeners, setListeners, setMaskArray, type, update, update, updaterDivideByMinibatch
-
Methods inherited from class java.lang.Object
equals, finalize, getClass, hashCode, notify, notifyAll, toString, wait, wait, wait
-
Methods inherited from interface org.deeplearning4j.nn.api.layers.IOutputLayer
getLabels, setLabels
-
Methods inherited from interface org.deeplearning4j.nn.api.Layer
activate, allowInputModification, calcRegularizationScore, feedForwardMaskArray, getEpochCount, getHelper, getIndex, getInputMiniBatchSize, getIterationCount, getListeners, getMaskArray, setCacheMode, setEpochCount, setIndex, setInput, setInputMiniBatchSize, setIterationCount, setListeners, setListeners, setMaskArray, type
-
Methods inherited from interface org.deeplearning4j.nn.api.Model
addListeners, applyConstraints, batchSize, clear, close, computeGradientAndScore, conf, fit, fit, getOptimizer, gradient, gradientAndScore, init, input, numParams, score, setConf, update, update
-
Methods inherited from interface org.deeplearning4j.nn.api.Trainable
getConfig, updaterDivideByMinibatch
-
-
-
-
Field Detail
-
INPUT_KEY
public static final String INPUT_KEY
- See Also:
- Constant Field Values
-
LABELS_KEY
public static final String LABELS_KEY
- See Also:
- Constant Field Values
-
sameDiff
protected SameDiff sameDiff
-
outputVar
protected SDVariable outputVar
-
outputKey
protected String outputKey
-
labels
protected INDArray labels
-
params
protected INDArray params
-
gradients
protected INDArray gradients
-
-
Constructor Detail
-
SameDiffOutputLayer
public SameDiffOutputLayer(NeuralNetConfiguration conf, DataType dataType)
-
-
Method Detail
-
isPretrainLayer
public boolean isPretrainLayer()
Description copied from interface: Layer
Returns true if the layer can be trained in an unsupervised/pretrain manner (AE, VAE, etc.)
- Specified by:
isPretrainLayer in interface Layer
- Returns:
- true if the layer can be pretrained (using fit(INDArray)), false otherwise
-
clearNoiseWeightParams
public void clearNoiseWeightParams()
- Specified by:
clearNoiseWeightParams in interface Layer
-
activate
public INDArray activate(boolean training, LayerWorkspaceMgr workspaceMgr)
Description copied from interface: Layer
Perform forward pass and return the activations array with the last set input
- Specified by:
activate in interface Layer
- Parameters:
training - training or test mode
workspaceMgr - Workspace manager
- Returns:
- the activation (layer output) of the last specified input. Note that the returned array should be placed in the ArrayType.ACTIVATIONS workspace via the workspace manager
-
backpropGradient
public Pair<Gradient,INDArray> backpropGradient(INDArray epsilon, LayerWorkspaceMgr workspaceMgr)
Description copied from interface: Layer
Calculate the gradient relative to the error in the next layer
- Specified by:
backpropGradient in interface Layer
- Parameters:
epsilon - w^(L+1)*delta^(L+1). Or equivalently dC/da, i.e., (dC/dz)*(dz/da) = dC/da, where C is the cost function and a = sigma(z) is the activation.
workspaceMgr - Workspace manager
- Returns:
- Pair<Gradient,INDArray>, where Gradient is the gradient for this layer and INDArray is the epsilon (activation gradient) needed by the next layer, before element-wise multiplication by sigmaPrime(z). So for a standard feed-forward layer, if this layer is L, then return.getSecond() == dL/dIn = (w^(L)*(delta^(L))^T)^T. Note that the returned array should be placed in the ArrayType.ACTIVATION_GRAD workspace via the workspace manager
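The dL/dIn contract above can be illustrated with a minimal plain-Java sketch. This is not DL4J code: ordinary arrays stand in for INDArrays, and a single example stands in for a minibatch. For a dense layer with weights W of shape [nIn, nOut] and activation gradient delta of shape [nOut], the formula (w*delta^T)^T reduces to multiplying delta by W transposed:

```java
public class EpsilonDemo {
    // epsilonOut = W * delta (equivalently delta * W^T for a row vector):
    // the activation gradient passed back to the previous layer.
    // W has shape [nIn][nOut], delta has shape [nOut]; result has shape [nIn].
    static double[] backpropEpsilon(double[][] w, double[] delta) {
        int nIn = w.length, nOut = w[0].length;
        double[] out = new double[nIn];
        for (int i = 0; i < nIn; i++)
            for (int j = 0; j < nOut; j++)
                out[i] += w[i][j] * delta[j];
        return out;
    }

    public static void main(String[] args) {
        double[][] w = {{1, 2}, {3, 4}};  // nIn = 2, nOut = 2
        double[] delta = {0.5, -1};
        // out[0] = 1*0.5 + 2*(-1) = -1.5; out[1] = 3*0.5 + 4*(-1) = -2.5
        System.out.println(java.util.Arrays.toString(backpropEpsilon(w, delta)));
    }
}
```

In the real layer this product is computed inside the SameDiff graph and the result is the second element of the returned Pair.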
-
params
public INDArray params()
Returns the parameters of the neural network as a flattened row vector
- Specified by:
params in interface Model
- Specified by:
params in interface Trainable
- Overrides:
params in class AbstractLayer<SameDiffOutputLayer>
- Returns:
- the parameters of the neural network
-
getParam
public INDArray getParam(String param)
Description copied from interface: Model
Get the parameter
- Specified by:
getParam in interface Model
- Overrides:
getParam in class AbstractLayer<SameDiffOutputLayer>
- Parameters:
param - the key of the parameter
- Returns:
- the parameter vector/matrix with that particular key
-
numParams
public long numParams()
Description copied from class: AbstractLayer
The number of parameters for the model
- Specified by:
numParams in interface Model
- Specified by:
numParams in interface Trainable
- Overrides:
numParams in class AbstractLayer<SameDiffOutputLayer>
- Returns:
- the number of parameters for the model
-
setParam
public void setParam(String key, INDArray val)
Description copied from interface: Model
Set the parameter with a new ndarray
- Specified by:
setParam in interface Model
- Overrides:
setParam in class AbstractLayer<SameDiffOutputLayer>
- Parameters:
key - the key to set
val - the new ndarray
-
setParams
public void setParams(INDArray params)
Description copied from interface: Model
Set the parameters for this model. This expects a linear ndarray which will then be unpacked internally relative to the expected ordering of the model
- Specified by:
setParams in interface Model
- Overrides:
setParams in class AbstractLayer<SameDiffOutputLayer>
- Parameters:
params - the parameters for the model
-
setParams
protected void setParams(INDArray params, char order)
- Overrides:
setParams in class AbstractLayer<SameDiffOutputLayer>
-
setParamsViewArray
public void setParamsViewArray(INDArray params)
Description copied from interface: Model
Set the initial parameters array as a view of the full (backprop) network parameters. NOTE: this is intended to be used internally in MultiLayerNetwork and ComputationGraph, not by users.
- Specified by:
setParamsViewArray in interface Model
- Overrides:
setParamsViewArray in class AbstractLayer<SameDiffOutputLayer>
- Parameters:
params - a 1 x nParams row vector that is a view of the larger (MLN/CG) parameters array
-
getGradientsViewArray
public INDArray getGradientsViewArray()
- Specified by:
getGradientsViewArray in interface Model
- Specified by:
getGradientsViewArray in interface Trainable
- Overrides:
getGradientsViewArray in class AbstractLayer<SameDiffOutputLayer>
- Returns:
- 1D gradients view array
-
setBackpropGradientsViewArray
public void setBackpropGradientsViewArray(INDArray gradients)
Description copied from interface: Model
Set the gradients array as a view of the full (backprop) network parameters. NOTE: this is intended to be used internally in MultiLayerNetwork and ComputationGraph, not by users.
- Specified by:
setBackpropGradientsViewArray in interface Model
- Overrides:
setBackpropGradientsViewArray in class AbstractLayer<SameDiffOutputLayer>
- Parameters:
gradients - a 1 x nParams row vector that is a view of the larger (MLN/CG) gradients array
-
setParamTable
public void setParamTable(Map<String,INDArray> paramTable)
Description copied from interface: Model
Setter for the param table
- Specified by:
setParamTable in interface Model
- Overrides:
setParamTable in class AbstractLayer<SameDiffOutputLayer>
-
paramTable
public Map<String,INDArray> paramTable()
Description copied from interface: Model
The param table
- Specified by:
paramTable in interface Model
- Overrides:
paramTable in class AbstractLayer<SameDiffOutputLayer>
- Returns:
-
paramTable
public Map<String,INDArray> paramTable(boolean backpropParamsOnly)
Description copied from interface: Model
Table of parameters by key, for backprop. For many models (dense layers, etc.) all parameters are backprop parameters
- Specified by:
paramTable in interface Model
- Specified by:
paramTable in interface Trainable
- Overrides:
paramTable in class AbstractLayer<SameDiffOutputLayer>
- Parameters:
backpropParamsOnly - If true, return backprop params only. If false, return all params (equivalent to paramTable())
- Returns:
- Parameter table
-
doInit
protected void doInit()
-
needsLabels
public boolean needsLabels()
Description copied from interface: IOutputLayer
Returns true if labels are required for this output layer
- Specified by:
needsLabels in interface IOutputLayer
- Returns:
- true if this output layer needs labels, false otherwise
-
computeScore
public double computeScore(double fullNetRegTerm, boolean training, LayerWorkspaceMgr workspaceMgr)
Description copied from interface: IOutputLayer
Compute score after labels and input have been set.
- Specified by:
computeScore in interface IOutputLayer
- Parameters:
fullNetRegTerm - Regularization score (l1/l2/weight decay) for the entire network
training - whether score should be calculated at train or test time (this affects things like application of dropout, etc.)
- Returns:
- score (loss function)
-
computeScoreForExamples
public INDArray computeScoreForExamples(double fullNetRegTerm, LayerWorkspaceMgr workspaceMgr)
Description copied from interface: IOutputLayer
Compute the score for each example individually, after labels and input have been set.
- Specified by:
computeScoreForExamples in interface IOutputLayer
- Parameters:
fullNetRegTerm - Regularization score (l1/l2/weight decay) for the entire network
- Returns:
- A column INDArray of shape [numExamples,1], where entry i is the score of the ith example
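The [numExamples,1] column shape can be sketched in plain Java. This is an illustration only, not DL4J code: mean squared error stands in for whichever loss function the layer is configured with, and the regularization term is simply added to each row, mirroring the fullNetRegTerm parameter above.

```java
public class PerExampleScore {
    // For each example i: scores[i][0] = loss(example i) + fullNetRegTerm.
    // Returns a column of shape [numExamples][1], matching the contract above.
    static double[][] scorePerExample(double[][] predictions, double[][] labels,
                                      double fullNetRegTerm) {
        double[][] scores = new double[predictions.length][1];
        for (int i = 0; i < predictions.length; i++) {
            double mse = 0;
            for (int j = 0; j < predictions[i].length; j++) {
                double d = predictions[i][j] - labels[i][j];
                mse += d * d;
            }
            scores[i][0] = mse / predictions[i].length + fullNetRegTerm;
        }
        return scores;
    }

    public static void main(String[] args) {
        double[][] p = {{0.9, 0.1}, {0.2, 0.8}};
        double[][] y = {{1, 0}, {0, 1}};
        double[][] s = scorePerExample(p, y, 0.0);
        // Each example scores approximately 0.01 under MSE
        System.out.println(s.length + " x " + s[0].length);
    }
}
```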
-
f1Score
public double f1Score(DataSet data)
Description copied from interface: Classifier
Sets the input and labels and returns a score for the prediction with respect to the true labels
- Specified by:
f1Score in interface Classifier
- Parameters:
data - the data to score
- Returns:
- the score for the given input/label pairs
-
f1Score
public double f1Score(INDArray examples, INDArray labels)
Description copied from interface: Classifier
Returns the f1 score for the given examples. Think of this as a percentage correct: the higher the number, the more it got right. This is on a scale from 0 to 1.
- Specified by:
f1Score in interface Classifier
- Parameters:
examples - the examples to classify (one example in each row)
labels - the true labels
- Returns:
- the scores for each ndarray
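For reference, the standard F1 definition behind this 0-to-1 scale can be computed by hand. A minimal plain-Java sketch for the binary case (the layer itself computes this from its predictions and the supplied labels):

```java
public class F1Demo {
    // F1 = 2 * precision * recall / (precision + recall), on a 0..1 scale.
    // predicted and actual are parallel arrays of binary class labels.
    static double f1(int[] predicted, int[] actual) {
        int tp = 0, fp = 0, fn = 0;
        for (int i = 0; i < predicted.length; i++) {
            if (predicted[i] == 1 && actual[i] == 1) tp++;       // true positive
            else if (predicted[i] == 1 && actual[i] == 0) fp++;  // false positive
            else if (predicted[i] == 0 && actual[i] == 1) fn++;  // false negative
        }
        if (tp == 0) return 0.0;  // avoid division by zero
        double precision = tp / (double) (tp + fp);
        double recall = tp / (double) (tp + fn);
        return 2 * precision * recall / (precision + recall);
    }

    public static void main(String[] args) {
        int[] pred = {1, 1, 0, 1};
        int[] act  = {1, 0, 0, 1};
        // tp=2, fp=1, fn=0 -> precision=2/3, recall=1, F1 = 0.8
        System.out.println(f1(pred, act));
    }
}
```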
-
numLabels
public int numLabels()
Description copied from interface: Classifier
Returns the number of possible labels
- Specified by:
numLabels in interface Classifier
- Returns:
- the number of possible labels for this classifier
-
fit
public void fit(DataSetIterator iter)
Description copied from interface: Classifier
Train the model based on the dataset iterator
- Specified by:
fit in interface Classifier
- Parameters:
iter - the iterator to train on
-
predict
public int[] predict(INDArray examples)
Description copied from interface: Classifier
Takes in a list of examples. For each row, returns a label
- Specified by:
predict in interface Classifier
- Parameters:
examples - the examples to classify (one example in each row)
- Returns:
- the labels for each example
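For the usual classification case, the per-row label is the index of the largest entry in that row of the output activations. A plain-Java sketch of that decision rule (the exact rule depends on the configured loss and activation; argmax over class probabilities is the common case):

```java
public class PredictDemo {
    // For each row of the output matrix, return the index of the largest
    // entry: the predicted class label for that example.
    static int[] predict(double[][] output) {
        int[] labels = new int[output.length];
        for (int i = 0; i < output.length; i++) {
            int best = 0;
            for (int j = 1; j < output[i].length; j++)
                if (output[i][j] > output[i][best]) best = j;
            labels[i] = best;
        }
        return labels;
    }

    public static void main(String[] args) {
        double[][] probs = {{0.1, 0.7, 0.2}, {0.6, 0.3, 0.1}};
        // Row 0 peaks at index 1, row 1 at index 0
        System.out.println(java.util.Arrays.toString(predict(probs)));
    }
}
```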
-
predict
public List<String> predict(DataSet dataSet)
Description copied from interface: Classifier
Takes in a DataSet of examples. For each row, returns a label
- Specified by:
predict in interface Classifier
- Parameters:
dataSet - the examples to classify
- Returns:
- the labels for each example
-
fit
public void fit(INDArray examples, INDArray labels)
Description copied from interface: Classifier
Fit the model
- Specified by:
fit in interface Classifier
- Parameters:
examples - the examples to classify (one example in each row)
labels - the example labels (a binary outcome matrix)
-
fit
public void fit(DataSet data)
Description copied from interface: Classifier
Fit the model
- Specified by:
fit in interface Classifier
- Parameters:
data - the data to train on
-
fit
public void fit(INDArray examples, int[] labels)
Description copied from interface: Classifier
Fit the model
- Specified by:
fit in interface Classifier
- Parameters:
examples - the examples to classify (one example in each row)
labels - the labels for each example (the number of labels must match the number of rows in the examples)
-
-