public class SameDiffLayer extends AbstractLayer<AbstractSameDiffLayer>

Nested classes inherited from interface Layer:
Layer.TrainingMode, Layer.Type
Modifier and Type | Field and Description |
---|---|
protected ExternalErrorsFunction | fn |
protected INDArray | gradients |
protected Map<String,INDArray> | gradTable |
static String | INPUT_KEY |
static String | MASK_KEY |
protected String | outputKey |
protected SDVariable | outputVar |
protected INDArray | params |
protected Map<String,INDArray> | paramTable |
protected SameDiff | sameDiff |
Fields inherited from class AbstractLayer:
cacheMode, conf, dataType, dropoutApplied, epochCount, index, input, inputModificationAllowed, iterationCount, maskArray, maskState, preOutput, trainingListeners
Constructor and Description |
---|
SameDiffLayer(NeuralNetConfiguration conf, org.nd4j.linalg.api.buffer.DataType dataType) |
Modifier and Type | Method and Description |
---|---|
INDArray | activate(boolean training, LayerWorkspaceMgr workspaceMgr) - Perform forward pass and return the activations array with the last set input |
Pair<Gradient,INDArray> | backpropGradient(INDArray epsilon, LayerWorkspaceMgr workspaceMgr) - Calculate the gradient relative to the error in the next layer |
void | clearNoiseWeightParams() |
Layer | clone() |
protected void | doInit() |
Pair<INDArray,MaskState> | feedForwardMaskArray(INDArray maskArray, MaskState currentMaskState, int minibatchSize) - Feed forward the input mask array, setting it in the layer as appropriate. |
INDArray | getGradientsViewArray() |
INDArray | getParam(String param) - Get the parameter |
boolean | isPretrainLayer() - Returns true if the layer can be trained in an unsupervised/pretrain manner (AE, VAE, etc.) |
long | numParams() - The number of parameters for the model |
INDArray | params() - Returns the parameters of the neural network as a flattened row vector |
Map<String,INDArray> | paramTable() - The param table |
Map<String,INDArray> | paramTable(boolean backpropParamsOnly) - Table of parameters by key, for backprop. For many models (dense layers, etc.), all parameters are backprop parameters |
void | setBackpropGradientsViewArray(INDArray gradients) - Set the gradients array as a view of the full (backprop) network parameters. NOTE: this is intended to be used internally in MultiLayerNetwork and ComputationGraph, not by users. |
void | setParam(String key, INDArray val) - Set the parameter with a new ndarray |
void | setParams(INDArray params) - Set the parameters for this model. |
protected void | setParams(INDArray params, char order) |
void | setParamsViewArray(INDArray params) - Set the initial parameters array as a view of the full (backprop) network parameters. NOTE: this is intended to be used internally in MultiLayerNetwork and ComputationGraph, not by users. |
void | setParamTable(Map<String,INDArray> paramTable) - Setter for the param table |
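As a plain-Java sketch of the mask behaviour that feedForwardMaskArray propagates (applyMask here is a hypothetical helper, not DL4J code, and plain float[][] arrays stand in for INDArrays): a binary mask zeroes the activations at padded time steps of variable-length sequences, so they contribute nothing downstream.

```java
public class MaskDemo {
    // Zero out activations at masked (0) time steps, as a mask array does for
    // variable-length sequences in a recurrent network.
    // activations: [minibatch][timeSteps], mask: [minibatch][timeSteps].
    static float[][] applyMask(float[][] activations, float[][] mask) {
        float[][] out = new float[activations.length][];
        for (int i = 0; i < activations.length; i++) {
            out[i] = new float[activations[i].length];
            for (int t = 0; t < activations[i].length; t++) {
                out[i][t] = activations[i][t] * mask[i][t];
            }
        }
        return out;
    }

    public static void main(String[] args) {
        float[][] act  = {{1f, 2f, 3f}, {4f, 5f, 6f}};
        // Second example has only two valid time steps; the third is padding.
        float[][] mask = {{1f, 1f, 1f}, {1f, 1f, 0f}};
        float[][] masked = applyMask(act, mask);
        System.out.println(masked[1][2]); // 0.0 (padded step zeroed)
        System.out.println(masked[1][1]); // 5.0 (valid step unchanged)
    }
}
```

This is also why the minibatch size must be passed explicitly to feedForwardMaskArray: after reshaping, the mask's shape alone may not reveal which axis is the minibatch dimension.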
Methods inherited from class AbstractLayer:
activate, addListeners, allowInputModification, applyConstraints, applyDropOutIfNecessary, applyMask, assertInputSet, backpropDropOutIfPresent, batchSize, calcRegularizationScore, clear, computeGradientAndScore, conf, fit, fit, getConfig, getEpochCount, getHelper, getIndex, getInput, getInputMiniBatchSize, getListeners, getMaskArray, getOptimizer, gradient, gradientAndScore, init, input, layerConf, layerId, numParams, score, setCacheMode, setConf, setEpochCount, setIndex, setInput, setInputMiniBatchSize, setListeners, setListeners, setMaskArray, type, update, update, updaterDivideByMinibatch

Methods inherited from class java.lang.Object:
equals, finalize, getClass, hashCode, notify, notifyAll, toString, wait, wait, wait

Methods inherited from interface Layer:
getIterationCount, setIterationCount
public static final String INPUT_KEY
public static final String MASK_KEY
protected SameDiff sameDiff
protected SDVariable outputVar
protected ExternalErrorsFunction fn
protected String outputKey
protected INDArray params
protected INDArray gradients
public SameDiffLayer(NeuralNetConfiguration conf, org.nd4j.linalg.api.buffer.DataType dataType)
public boolean isPretrainLayer()
Description copied from interface: Layer
Returns true if the layer can be trained in an unsupervised/pretrain manner (AE, VAE, etc.)

public void clearNoiseWeightParams()
public INDArray activate(boolean training, LayerWorkspaceMgr workspaceMgr)
Description copied from interface: Layer
Perform forward pass and return the activations array with the last set input
Parameters:
training - training or test mode
workspaceMgr - Workspace manager
Returns: Activations array, in the ArrayType.ACTIVATIONS workspace via the workspace manager

public Pair<Gradient,INDArray> backpropGradient(INDArray epsilon, LayerWorkspaceMgr workspaceMgr)
Description copied from interface: Layer
Calculate the gradient relative to the error in the next layer
Parameters:
epsilon - w^(L+1)*delta^(L+1). Or equivalently dC/da, i.e., (dC/dz)*(dz/da) = dC/da, where C is the cost function and a = sigma(z) is the activation.
workspaceMgr - Workspace manager
Returns: Gradient and epsilon array, in the ArrayType.ACTIVATION_GRAD workspace via the workspace manager

public INDArray params()
Returns the parameters of the neural network as a flattened row vector
Specified by: params in interface Model
Specified by: params in interface Trainable
Overrides: params in class AbstractLayer<AbstractSameDiffLayer>
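The flattened-row-vector convention can be made concrete with a minimal plain-Java sketch (not DL4J code: a Map of float[] stands in for the layer's paramTable of INDArrays, and flatten is a hypothetical helper). Parameters are concatenated key by key into one vector, which is what a 1 x nParams params() array amounts to.

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class FlattenParams {
    // Concatenate the per-key parameter arrays into one row vector,
    // in the table's iteration order.
    static float[] flatten(Map<String, float[]> paramTable) {
        int total = 0;
        for (float[] p : paramTable.values()) total += p.length;
        float[] flat = new float[total];
        int offset = 0;
        for (float[] p : paramTable.values()) {
            System.arraycopy(p, 0, flat, offset, p.length);
            offset += p.length;
        }
        return flat;
    }

    public static void main(String[] args) {
        Map<String, float[]> table = new LinkedHashMap<>();
        table.put("W", new float[]{1f, 2f, 3f, 4f}); // 2x2 weights, row-major
        table.put("b", new float[]{5f, 6f});         // bias
        float[] flat = flatten(table);
        System.out.println(flat.length); // 6
        System.out.println(flat[4]);     // 5.0 (first bias element)
    }
}
```

Because the concatenation order is fixed, each index in the flat vector always maps back to the same parameter element, which is what makes the view-based setters below workable.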
public INDArray getParam(String param)
Description copied from interface: Model
Get the parameter
Specified by: getParam in interface Model
Overrides: getParam in class AbstractLayer<AbstractSameDiffLayer>
Parameters:
param - the key of the parameter

public long numParams()
Description copied from class: AbstractLayer
The number of parameters for the model
Specified by: numParams in interface Model
Specified by: numParams in interface Trainable
Overrides: numParams in class AbstractLayer<AbstractSameDiffLayer>
public void setParam(String key, INDArray val)
Description copied from interface: Model
Set the parameter with a new ndarray
Specified by: setParam in interface Model
Overrides: setParam in class AbstractLayer<AbstractSameDiffLayer>
Parameters:
key - the key to set
val - the new ndarray

public void setParams(INDArray params)
Description copied from interface: Model
Set the parameters for this model.
Specified by: setParams in interface Model
Overrides: setParams in class AbstractLayer<AbstractSameDiffLayer>
Parameters:
params - the parameters for the model

protected void setParams(INDArray params, char order)
Overrides: setParams in class AbstractLayer<AbstractSameDiffLayer>
public void setParamsViewArray(INDArray params)
Description copied from interface: Model
Set the initial parameters array as a view of the full (backprop) network parameters. NOTE: this is intended to be used internally in MultiLayerNetwork and ComputationGraph, not by users.
Specified by: setParamsViewArray in interface Model
Overrides: setParamsViewArray in class AbstractLayer<AbstractSameDiffLayer>
Parameters:
params - a 1 x nParams row vector that is a view of the larger (MLN/CG) parameters array

public INDArray getGradientsViewArray()
Specified by: getGradientsViewArray in interface Model
Specified by: getGradientsViewArray in interface Trainable
Overrides: getGradientsViewArray in class AbstractLayer<AbstractSameDiffLayer>
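The "view of the larger (MLN/CG) array" semantics used by these methods can be sketched with java.nio.FloatBuffer (a plain-Java illustration of shared-storage views, not DL4J internals; sliceView is a hypothetical helper). A slice shares storage with the full buffer, so writes through the per-layer view are immediately visible in the full network array, with no copying.

```java
import java.nio.FloatBuffer;

public class GradientView {
    // Return a view over [offset, offset + length) of the full buffer.
    // slice() shares storage, so writes through the view hit the same memory.
    static FloatBuffer sliceView(FloatBuffer full, int offset, int length) {
        FloatBuffer dup = full.duplicate(); // same storage, independent position
        dup.position(offset);
        dup.limit(offset + length);
        return dup.slice();
    }

    public static void main(String[] args) {
        // One flat buffer standing in for the full network's gradient array.
        FloatBuffer fullGradients = FloatBuffer.allocate(6);

        // Per-layer "view": positions 2..5 of the full array.
        FloatBuffer layerView = sliceView(fullGradients, 2, 4);

        // Writing through the view updates the shared storage, which is the
        // contract a gradients-view setter relies on.
        layerView.put(0, 0.5f);
        layerView.put(3, -1.0f);

        System.out.println(fullGradients.get(2)); // 0.5
        System.out.println(fullGradients.get(5)); // -1.0
    }
}
```

This single-backing-array design lets the updater step over every parameter and gradient of the network in one pass, rather than per layer.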
public void setBackpropGradientsViewArray(INDArray gradients)
Description copied from interface: Model
Set the gradients array as a view of the full (backprop) network parameters. NOTE: this is intended to be used internally in MultiLayerNetwork and ComputationGraph, not by users.
Specified by: setBackpropGradientsViewArray in interface Model
Overrides: setBackpropGradientsViewArray in class AbstractLayer<AbstractSameDiffLayer>
Parameters:
gradients - a 1 x nParams row vector that is a view of the larger (MLN/CG) gradients array

public void setParamTable(Map<String,INDArray> paramTable)
Description copied from interface: Model
Setter for the param table
Specified by: setParamTable in interface Model
Overrides: setParamTable in class AbstractLayer<AbstractSameDiffLayer>
public Map<String,INDArray> paramTable()
Description copied from interface: Model
The param table
Specified by: paramTable in interface Model
Overrides: paramTable in class AbstractLayer<AbstractSameDiffLayer>
public Map<String,INDArray> paramTable(boolean backpropParamsOnly)
Description copied from interface: Model
Table of parameters by key, for backprop. For many models (dense layers, etc.), all parameters are backprop parameters
Specified by: paramTable in interface Model
Specified by: paramTable in interface Trainable
Overrides: paramTable in class AbstractLayer<AbstractSameDiffLayer>
Parameters:
backpropParamsOnly - If true, return backprop params only. If false, return all params (equivalent to paramsTable())

protected void doInit()
public Pair<INDArray,MaskState> feedForwardMaskArray(INDArray maskArray, MaskState currentMaskState, int minibatchSize)
Description copied from interface: Layer
Feed forward the input mask array, setting it in the layer as appropriate.
Specified by: feedForwardMaskArray in interface Layer
Overrides: feedForwardMaskArray in class AbstractLayer<AbstractSameDiffLayer>
Parameters:
maskArray - Mask array to set
currentMaskState - Current state of the mask - see MaskState
minibatchSize - Current minibatch size. Needs to be known as it cannot always be inferred from the activations array due to reshaping (such as a DenseLayer within a recurrent neural network)

Copyright © 2019. All rights reserved.