public class SameDiffOutputLayer extends AbstractLayer<SameDiffOutputLayer> implements IOutputLayer
Nested classes/interfaces inherited from interface Layer: Layer.TrainingMode, Layer.Type
Modifier and Type | Field and Description |
---|---|
protected INDArray | gradients |
protected Map<String,INDArray> | gradTable |
static String | INPUT_KEY |
protected INDArray | labels |
static String | LABELS_KEY |
protected String | outputKey |
protected SDVariable | outputVar |
protected INDArray | params |
protected Map<String,INDArray> | paramTable |
protected SameDiff | sameDiff |
Fields inherited from class AbstractLayer: cacheMode, conf, dropoutApplied, epochCount, index, input, inputModificationAllowed, iterationCount, maskArray, maskState, preOutput, trainingListeners
Constructor and Description |
---|
SameDiffOutputLayer(NeuralNetConfiguration conf) |
Modifier and Type | Method and Description |
---|---|
INDArray | activate(boolean training, LayerWorkspaceMgr workspaceMgr): Perform forward pass and return the activations array with the last set input |
Pair<Gradient,INDArray> | backpropGradient(INDArray epsilon, LayerWorkspaceMgr workspaceMgr): Calculate the gradient relative to the error in the next layer |
void | clearNoiseWeightParams() |
Layer | clone() |
double | computeScore(double fullNetworkL1, double fullNetworkL2, boolean training, LayerWorkspaceMgr workspaceMgr): Compute score after labels and input have been set. |
INDArray | computeScoreForExamples(double fullNetworkL1, double fullNetworkL2, LayerWorkspaceMgr workspaceMgr): Compute the score for each example individually, after labels and input have been set. |
protected void | doInit() |
double | f1Score(DataSet data): Sets the input and labels and returns a score for the prediction with respect to the true labels |
double | f1Score(INDArray examples, INDArray labels): Returns the F1 score for the given examples. |
void | fit(DataSet data): Fit the model |
void | fit(DataSetIterator iter): Train the model based on the DataSetIterator |
void | fit(INDArray examples, INDArray labels): Fit the model |
void | fit(INDArray examples, int[] labels): Fit the model |
INDArray | getGradientsViewArray() |
INDArray | getParam(String param): Get the parameter |
boolean | isPretrainLayer(): Returns true if the layer can be trained in an unsupervised/pretrain manner (AE, VAE, etc.) |
INDArray | labelProbabilities(INDArray examples): Returns the probabilities for each label, for each example, row-wise |
boolean | needsLabels(): Returns true if labels are required for this output layer |
int | numLabels(): Returns the number of possible labels |
int | numParams(): The number of parameters for the model |
INDArray | params(): Returns the parameters of the neural network as a flattened row vector |
Map<String,INDArray> | paramTable(): The param table |
Map<String,INDArray> | paramTable(boolean backpropParamsOnly): Table of parameters by key, for backprop. For many models (dense layers, etc.) all parameters are backprop parameters |
List<String> | predict(DataSet dataSet): Takes in a DataSet of examples; for each row, returns a label |
int[] | predict(INDArray examples): Takes in a list of examples; for each row, returns a label |
void | setBackpropGradientsViewArray(INDArray gradients): Set the gradients array as a view of the full (backprop) network parameters. NOTE: this is intended to be used internally in MultiLayerNetwork and ComputationGraph, not by users. |
void | setParam(String key, INDArray val): Set the parameter with a new ndarray |
void | setParams(INDArray params): Set the parameters for this model. |
protected void | setParams(INDArray params, char order) |
void | setParamsViewArray(INDArray params): Set the initial parameters array as a view of the full (backprop) network parameters. NOTE: this is intended to be used internally in MultiLayerNetwork and ComputationGraph, not by users. |
void | setParamTable(Map<String,INDArray> paramTable): Setter for the param table |
Methods inherited from class AbstractLayer: activate, addListeners, allowInputModification, applyConstraints, applyDropOutIfNecessary, applyMask, assertInputSet, backpropDropOutIfPresent, batchSize, calcL1, calcL2, clear, computeGradientAndScore, conf, feedForwardMaskArray, fit, fit, getConfig, getEpochCount, getHelper, getIndex, getInput, getInputMiniBatchSize, getListeners, getMaskArray, getOptimizer, gradient, gradientAndScore, init, input, layerConf, layerId, numParams, score, setCacheMode, setConf, setEpochCount, setIndex, setInput, setInputMiniBatchSize, setListeners, setListeners, setMaskArray, type, update, update
Methods inherited from class java.lang.Object: equals, finalize, getClass, hashCode, notify, notifyAll, toString, wait, wait, wait
Methods inherited from interface IOutputLayer: getLabels, setLabels
Methods inherited from interface Layer: activate, allowInputModification, calcL1, calcL2, feedForwardMaskArray, getEpochCount, getHelper, getIndex, getInputMiniBatchSize, getIterationCount, getListeners, getMaskArray, setCacheMode, setEpochCount, setIndex, setInput, setInputMiniBatchSize, setIterationCount, setListeners, setListeners, setMaskArray, type
Methods inherited from interface Model: addListeners, applyConstraints, batchSize, clear, computeGradientAndScore, conf, fit, fit, getOptimizer, gradient, gradientAndScore, init, input, numParams, score, setConf, update, update
public static final String INPUT_KEY
public static final String LABELS_KEY
protected SameDiff sameDiff
protected SDVariable outputVar
protected String outputKey
protected INDArray labels
protected INDArray params
protected INDArray gradients
public SameDiffOutputLayer(NeuralNetConfiguration conf)
public boolean isPretrainLayer()
Description copied from interface: Layer
Returns true if the layer can be trained in an unsupervised/pretrain manner (AE, VAE, etc.)
Specified by: isPretrainLayer in interface Layer

public void clearNoiseWeightParams()
Specified by: clearNoiseWeightParams in interface Layer
public INDArray activate(boolean training, LayerWorkspaceMgr workspaceMgr)
Description copied from interface: Layer
Perform forward pass and return the activations array with the last set input.
Specified by: activate in interface Layer
Parameters:
training - training or test mode
workspaceMgr - Workspace manager
Returns: the activations array. Note that the returned array should be placed in the ArrayType.ACTIVATIONS workspace via the workspace manager.

public Pair<Gradient,INDArray> backpropGradient(INDArray epsilon, LayerWorkspaceMgr workspaceMgr)
Description copied from interface: Layer
Calculate the gradient relative to the error in the next layer.
Specified by: backpropGradient in interface Layer
Parameters:
epsilon - w^(L+1)*delta^(L+1). Or, equivalently: dC/da, i.e., (dC/dz)*(dz/da) = dC/da, where C is the cost function and a = sigma(z) is the activation.
workspaceMgr - Workspace manager
Returns: Pair<Gradient,INDArray>. Note that the returned array should be placed in the ArrayType.ACTIVATION_GRAD workspace via the workspace manager.

public INDArray params()
Returns the parameters of the neural network as a flattened row vector.
Specified by: params in interface Model
Specified by: params in interface Trainable
Overrides: params in class AbstractLayer<SameDiffOutputLayer>
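Both activate(...) and backpropGradient(...) take a LayerWorkspaceMgr and place their results in the ACTIVATIONS / ACTIVATION_GRAD workspaces when one is active. Below is a minimal sketch of driving the two methods by hand; the layer instance, the minibatch shapes, the use of LayerWorkspaceMgr.noWorkspaces(), the null epsilon, and the beta-era import paths are assumptions for illustration only, since these methods are normally invoked internally by MultiLayerNetwork or ComputationGraph.

```java
// Minimal sketch (not the library's own usage): driving the forward and backward
// pass of an already-initialized SameDiffOutputLayer by hand.
import org.deeplearning4j.nn.gradient.Gradient;
import org.deeplearning4j.nn.layers.samediff.SameDiffOutputLayer; // package path assumed (beta-era API)
import org.deeplearning4j.nn.workspace.LayerWorkspaceMgr;
import org.nd4j.linalg.api.ndarray.INDArray;
import org.nd4j.linalg.factory.Nd4j;
import org.nd4j.linalg.primitives.Pair;                           // Pair location assumed (beta-era API)

public class ForwardBackwardSketch {
    public static void run(SameDiffOutputLayer layer) {
        // No workspaces: returned arrays are plain, detached INDArrays
        LayerWorkspaceMgr mgr = LayerWorkspaceMgr.noWorkspaces();

        INDArray features = Nd4j.rand(8, 4);   // 8 examples, 4 features (shapes assumed)
        INDArray labels   = Nd4j.rand(8, 3);   // placeholder labels, 3 outputs (assumed)

        layer.setInput(features, mgr);                      // inherited from AbstractLayer
        INDArray activations = layer.activate(true, mgr);   // forward pass, training mode

        layer.setLabels(labels);                            // IOutputLayer method; the loss is defined over these
        // Assumption of this sketch: as an output layer, the error signal comes from the
        // labels/loss rather than from an incoming epsilon, so null is passed here.
        Pair<Gradient, INDArray> p = layer.backpropGradient(null, mgr);
        Gradient layerGradient = p.getFirst();  // per-parameter gradients for this layer
        INDArray epsilonOut    = p.getSecond(); // dC/da to pass to the layer below
    }
}
```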
public INDArray getParam(String param)
Description copied from interface: Model
Get the parameter.
Specified by: getParam in interface Model
Overrides: getParam in class AbstractLayer<SameDiffOutputLayer>
Parameters:
param - the key of the parameter

public int numParams()
Description copied from class: AbstractLayer
The number of parameters for the model.
Specified by: numParams in interface Model
Specified by: numParams in interface Trainable
Overrides: numParams in class AbstractLayer<SameDiffOutputLayer>
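params() flattens every parameter into a single row vector of length numParams(), while getParam(String) and paramTable() expose the same parameters by key. A short hedged sketch of inspecting both views follows; the layer instance and the import paths are assumptions, and the actual parameter keys depend on how the SameDiff layer was defined.

```java
// Hedged sketch: inspecting the flattened and per-key parameter views of an
// already-initialized SameDiffOutputLayer.
import java.util.Arrays;
import java.util.Map;
import org.deeplearning4j.nn.layers.samediff.SameDiffOutputLayer; // package path assumed
import org.nd4j.linalg.api.ndarray.INDArray;

public class ParamInspectionSketch {
    public static void inspect(SameDiffOutputLayer layer) {
        INDArray flat = layer.params();   // all parameters as one flattened row vector
        System.out.println("numParams = " + layer.numParams()
                + ", flattened length = " + flat.length());

        // The same parameters, keyed by name; each value is also reachable via getParam(key)
        for (Map.Entry<String, INDArray> e : layer.paramTable().entrySet()) {
            System.out.println(e.getKey() + " -> shape " + Arrays.toString(e.getValue().shape()));
        }
    }
}
```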
public void setParam(String key, INDArray val)
Description copied from interface: Model
Set the parameter with a new ndarray.
Specified by: setParam in interface Model
Overrides: setParam in class AbstractLayer<SameDiffOutputLayer>
Parameters:
key - the key to set
val - the new ndarray

public void setParams(INDArray params)
Description copied from interface: Model
Set the parameters for this model.
Specified by: setParams in interface Model
Overrides: setParams in class AbstractLayer<SameDiffOutputLayer>
Parameters:
params - the parameters for the model

protected void setParams(INDArray params, char order)
Overrides: setParams in class AbstractLayer<SameDiffOutputLayer>
public void setParamsViewArray(INDArray params)
Description copied from interface: Model
Set the initial parameters array as a view of the full (backprop) network parameters. NOTE: this is intended to be used internally in MultiLayerNetwork and ComputationGraph, not by users.
Specified by: setParamsViewArray in interface Model
Overrides: setParamsViewArray in class AbstractLayer<SameDiffOutputLayer>
Parameters:
params - a 1 x nParams row vector that is a view of the larger (MLN/CG) parameters array

public INDArray getGradientsViewArray()
Specified by: getGradientsViewArray in interface Model
Specified by: getGradientsViewArray in interface Trainable
Overrides: getGradientsViewArray in class AbstractLayer<SameDiffOutputLayer>
public void setBackpropGradientsViewArray(INDArray gradients)
Description copied from interface: Model
Set the gradients array as a view of the full (backprop) network parameters. NOTE: this is intended to be used internally in MultiLayerNetwork and ComputationGraph, not by users.
Specified by: setBackpropGradientsViewArray in interface Model
Overrides: setBackpropGradientsViewArray in class AbstractLayer<SameDiffOutputLayer>
Parameters:
gradients - a 1 x nParams row vector that is a view of the larger (MLN/CG) gradients array

public void setParamTable(Map<String,INDArray> paramTable)
Description copied from interface: Model
Setter for the param table.
Specified by: setParamTable in interface Model
Overrides: setParamTable in class AbstractLayer<SameDiffOutputLayer>
public Map<String,INDArray> paramTable()
Description copied from interface: Model
The param table.
Specified by: paramTable in interface Model
Overrides: paramTable in class AbstractLayer<SameDiffOutputLayer>

public Map<String,INDArray> paramTable(boolean backpropParamsOnly)
Description copied from interface: Model
Table of parameters by key, for backprop. For many models (dense layers, etc.) all parameters are backprop parameters.
Specified by: paramTable in interface Model
Specified by: paramTable in interface Trainable
Overrides: paramTable in class AbstractLayer<SameDiffOutputLayer>
Parameters:
backpropParamsOnly - If true, return backprop params only. If false, return all params (equivalent to paramTable()).

protected void doInit()
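paramTable(boolean) restricts the table to backprop parameters when the flag is true. A small sketch contrasting it with the unrestricted paramTable() call (again, the layer instance and import path are assumptions):

```java
// Hedged sketch: full parameter table vs. backprop-only parameter table.
import java.util.Map;
import org.deeplearning4j.nn.layers.samediff.SameDiffOutputLayer; // package path assumed
import org.nd4j.linalg.api.ndarray.INDArray;

public class ParamTableSketch {
    public static void compare(SameDiffOutputLayer layer) {
        Map<String, INDArray> all = layer.paramTable();      // every parameter, by key
        Map<String, INDArray> bp  = layer.paramTable(true);  // backprop parameters only
        // For many layer types the two tables are identical; the backprop keys are
        // always a subset of the full key set.
        System.out.println("all:      " + all.keySet());
        System.out.println("backprop: " + bp.keySet());
    }
}
```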
public boolean needsLabels()
Description copied from interface: IOutputLayer
Returns true if labels are required for this output layer.
Specified by: needsLabels in interface IOutputLayer
public double computeScore(double fullNetworkL1, double fullNetworkL2, boolean training, LayerWorkspaceMgr workspaceMgr)
Description copied from interface: IOutputLayer
Compute score after labels and input have been set.
Specified by: computeScore in interface IOutputLayer
Parameters:
fullNetworkL1 - L1 regularization term for the entire network
fullNetworkL2 - L2 regularization term for the entire network
training - whether the score should be calculated at train or test time (this affects things like application of dropout, etc.)

public INDArray computeScoreForExamples(double fullNetworkL1, double fullNetworkL2, LayerWorkspaceMgr workspaceMgr)
Description copied from interface: IOutputLayer
Compute the score for each example individually, after labels and input have been set.
Specified by: computeScoreForExamples in interface IOutputLayer
Parameters:
fullNetworkL1 - L1 regularization term for the entire network (or 0.0 to not include regularization)
fullNetworkL2 - L2 regularization term for the entire network (or 0.0 to not include regularization)

public double f1Score(DataSet data)
Description copied from interface: Classifier
Sets the input and labels and returns a score for the prediction with respect to the true labels.
Specified by: f1Score in interface Classifier
Parameters:
data - the data to score

public double f1Score(INDArray examples, INDArray labels)
Description copied from interface: Classifier
Returns the F1 score for the given examples.
Specified by: f1Score in interface Classifier
Parameters:
examples - the examples to classify (one example in each row)
labels - the true labels

public int numLabels()
Description copied from interface: Classifier
Returns the number of possible labels.
Specified by: numLabels in interface Classifier
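computeScore(...) and computeScoreForExamples(...) only make sense once both the input and the labels have been set on the layer. A rough sketch of ad-hoc scoring follows, assuming an initialized layer instance and that a workspace-free LayerWorkspaceMgr is acceptable; imports and package path are assumed as in the earlier sketches.

```java
// Hedged sketch: scoring a minibatch after setting input and labels.
import org.deeplearning4j.nn.layers.samediff.SameDiffOutputLayer; // package path assumed
import org.deeplearning4j.nn.workspace.LayerWorkspaceMgr;
import org.nd4j.linalg.api.ndarray.INDArray;
import org.nd4j.linalg.dataset.DataSet;

public class ScoringSketch {
    public static void score(SameDiffOutputLayer layer, DataSet batch) {
        LayerWorkspaceMgr mgr = LayerWorkspaceMgr.noWorkspaces();

        layer.setInput(batch.getFeatures(), mgr);  // inherited from AbstractLayer
        layer.setLabels(batch.getLabels());        // inherited from IOutputLayer (see needsLabels() above)

        // Mean score over the minibatch; passing 0.0/0.0 omits the network-level L1/L2 terms
        double meanScore = layer.computeScore(0.0, 0.0, false, mgr);

        // One score per example
        INDArray perExample = layer.computeScoreForExamples(0.0, 0.0, mgr);

        System.out.println("mean score = " + meanScore
                + ", per-example scores length = " + perExample.length());
    }
}
```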
public void fit(DataSetIterator iter)
Description copied from interface: Classifier
Train the model based on the DataSetIterator.
Specified by: fit in interface Classifier
Parameters:
iter - the iterator to train on

public int[] predict(INDArray examples)
Description copied from interface: Classifier
Takes in a list of examples; for each row, returns a label.
Specified by: predict in interface Classifier
Parameters:
examples - the examples to classify (one example in each row)

public List<String> predict(DataSet dataSet)
Description copied from interface: Classifier
Takes in a DataSet of examples; for each row, returns a label.
Specified by: predict in interface Classifier
Parameters:
dataSet - the examples to classify

public INDArray labelProbabilities(INDArray examples)
Description copied from interface: Classifier
Returns the probabilities for each label, for each example, row-wise.
Specified by: labelProbabilities in interface Classifier
Parameters:
examples - the examples to classify (one example in each row)

public void fit(INDArray examples, INDArray labels)
Description copied from interface: Classifier
Fit the model.
Specified by: fit in interface Classifier
Parameters:
examples - the examples to classify (one example in each row)
labels - the example labels (a binary outcome matrix)

public void fit(DataSet data)
Description copied from interface: Classifier
Fit the model.
Specified by: fit in interface Classifier
Parameters:
data - the data to train on

public void fit(INDArray examples, int[] labels)
Description copied from interface: Classifier
Fit the model.
Specified by: fit in interface Classifier
Parameters:
examples - the examples to classify (one example in each row)
labels - the labels for each example (the number of labels must match the number of rows in the examples)
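Taken together, the Classifier methods above form a simple fit/predict workflow. The closing sketch below shows that contract end to end; the layer instance, the 4-feature/3-class shapes, and the import paths are illustrative assumptions, and in practice these calls are usually made on the enclosing MultiLayerNetwork or ComputationGraph rather than on the layer directly (a given DL4J version may not support all of them on a bare layer).

```java
// Hedged sketch: Classifier-style fit/predict contract, as documented above.
import java.util.Arrays;
import org.deeplearning4j.nn.layers.samediff.SameDiffOutputLayer; // package path assumed
import org.nd4j.linalg.api.ndarray.INDArray;
import org.nd4j.linalg.dataset.DataSet;
import org.nd4j.linalg.factory.Nd4j;

public class ClassifierWorkflowSketch {
    public static void run(SameDiffOutputLayer layer) {
        // Toy minibatch: 8 examples, 4 features, 3 classes (all shapes assumed)
        INDArray features = Nd4j.rand(8, 4);
        INDArray oneHotLabels = Nd4j.zeros(8, 3);
        for (int i = 0; i < 8; i++) {
            oneHotLabels.putScalar(i, i % 3, 1.0);  // arbitrary one-hot labels
        }

        layer.fit(new DataSet(features, oneHotLabels));        // fit the model on one minibatch

        int[] predicted = layer.predict(features);             // one class index per row
        INDArray probs  = layer.labelProbabilities(features);  // per-class probabilities, row-wise
        System.out.println("first prediction: " + predicted[0]
                + ", probabilities shape: " + Arrays.toString(probs.shape()));
    }
}
```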