public interface Layer extends Serializable, Cloneable, Model
Modifier and Type | Class and Description |
---|---|
static class | Layer.TrainingMode |
static class | Layer.Type |
Modifier and Type | Method and Description |
---|---|
org.nd4j.linalg.api.ndarray.INDArray | activate() - Trigger an activation with the last specified input |
org.nd4j.linalg.api.ndarray.INDArray | activate(boolean training) - Trigger an activation with the last specified input |
org.nd4j.linalg.api.ndarray.INDArray | activate(org.nd4j.linalg.api.ndarray.INDArray input) - Initialize the layer with the given input and return the activation for this layer given this input |
org.nd4j.linalg.api.ndarray.INDArray | activate(org.nd4j.linalg.api.ndarray.INDArray input, boolean training) - Initialize the layer with the given input and return the activation for this layer given this input |
org.nd4j.linalg.api.ndarray.INDArray | activate(org.nd4j.linalg.api.ndarray.INDArray input, Layer.TrainingMode training) - Initialize the layer with the given input and return the activation for this layer given this input |
org.nd4j.linalg.api.ndarray.INDArray | activate(Layer.TrainingMode training) - Trigger an activation with the last specified input |
org.nd4j.linalg.primitives.Pair<Gradient,org.nd4j.linalg.api.ndarray.INDArray> | backpropGradient(org.nd4j.linalg.api.ndarray.INDArray epsilon) - Calculate the gradient relative to the error in the next layer |
double | calcL1(boolean backpropOnlyParams) - Calculate the L1 regularization term; returns 0.0 if regularization is not used. |
double | calcL2(boolean backpropOnlyParams) - Calculate the L2 regularization term; returns 0.0 if regularization is not used. |
void | clearNoiseWeightParams() |
Layer | clone() - Clone the layer |
org.nd4j.linalg.primitives.Pair<org.nd4j.linalg.api.ndarray.INDArray,MaskState> | feedForwardMaskArray(org.nd4j.linalg.api.ndarray.INDArray maskArray, MaskState currentMaskState, int minibatchSize) - Feed forward the input mask array, setting it in the layer as appropriate. |
int | getEpochCount() |
int | getIndex() - Get the layer index. |
int | getInputMiniBatchSize() - Get the current/last input mini-batch size, as set by setInputMiniBatchSize(int) |
int | getIterationCount() |
Collection<IterationListener> | getListeners() - Get the iteration listeners for this layer. |
org.nd4j.linalg.api.ndarray.INDArray | getMaskArray() |
boolean | isPretrainLayer() - Returns true if the layer can be trained in an unsupervised/pretrain manner (AE, VAE, etc.) |
void | migrateInput() - For use with ND4J workspaces. |
org.nd4j.linalg.api.ndarray.INDArray | preOutput(org.nd4j.linalg.api.ndarray.INDArray x) - Raw activations |
org.nd4j.linalg.api.ndarray.INDArray | preOutput(org.nd4j.linalg.api.ndarray.INDArray x, boolean training) - Raw activations |
org.nd4j.linalg.api.ndarray.INDArray | preOutput(org.nd4j.linalg.api.ndarray.INDArray x, Layer.TrainingMode training) - Raw activations |
void | setCacheMode(CacheMode mode) - Sets the given CacheMode for the current layer |
void | setEpochCount(int epochCount) - Set the current epoch count (number of epochs passed) for the layer/network |
void | setIndex(int index) - Set the layer index. |
void | setInput(org.nd4j.linalg.api.ndarray.INDArray input) - Set the layer input. |
void | setInputMiniBatchSize(int size) - Set the current/last input mini-batch size; used for score and gradient calculations. |
void | setIterationCount(int iterationCount) - Set the current iteration count (number of parameter updates) for the layer/network |
void | setListeners(Collection<IterationListener> listeners) - Set the iteration listeners for this layer. |
void | setListeners(IterationListener... listeners) - Set the iteration listeners for this layer. |
void | setMaskArray(org.nd4j.linalg.api.ndarray.INDArray maskArray) - Set the mask array. |
Layer | transpose() - Deprecated. |
Layer.Type | type() - Returns the layer type |
Methods inherited from interface Model: accumulateScore, addListeners, applyConstraints, batchSize, clear, computeGradientAndScore, conf, fit, fit, getGradientsViewArray, getOptimizer, getParam, gradient, gradientAndScore, init, initParams, input, iterate, numParams, numParams, params, paramTable, paramTable, score, setBackpropGradientsViewArray, setConf, setParam, setParams, setParamsViewArray, setParamTable, update, update, validateInput
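The tables above cover the full surface of the interface. Below is a minimal, hedged sketch of the most common calls; how the Layer reference is obtained (e.g. MultiLayerNetwork.getLayer(int)) and the random input created with Nd4j.rand are assumptions outside this interface:

```java
import java.util.Arrays;

import org.deeplearning4j.nn.api.Layer;
import org.nd4j.linalg.api.ndarray.INDArray;
import org.nd4j.linalg.factory.Nd4j;

public class LayerInferenceSketch {
    // The Layer reference is assumed to come from an already-initialized
    // network, e.g. MultiLayerNetwork.getLayer(0).
    public static void runInference(Layer layer) {
        // A single minibatch of 3 examples with 4 features each (random data)
        INDArray input = Nd4j.rand(3, 4);

        // training = false: run in inference/test mode
        INDArray out = layer.activate(input, false);

        // Raw pre-activation values for the same input
        INDArray pre = layer.preOutput(input, false);

        System.out.println("Layer type: " + layer.type()
                + ", output shape: " + Arrays.toString(out.shape()));
    }
}
```

The method detail entries below give the parameter documentation for each signature.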
void setCacheMode(CacheMode mode)
Parameters: mode - the CacheMode to set for the current layer

double calcL2(boolean backpropOnlyParams)
Parameters: backpropOnlyParams - If true: calculate L2 based on backprop params only. If false: calculate based on all params (including pretrain params, if any)

double calcL1(boolean backpropOnlyParams)
Parameters: backpropOnlyParams - If true: calculate L1 based on backprop params only. If false: calculate based on all params (including pretrain params, if any)

Layer.Type type()
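A hedged sketch of querying the two regularization accessors, calcL1 and calcL2, assuming the layer belongs to an already-initialized network:

```java
import org.deeplearning4j.nn.api.Layer;

public class RegularizationReport {
    // layer is assumed to belong to an already-initialized network
    public static void print(Layer layer) {
        // true -> consider backprop params only; false would also include
        // pretrain-only params (e.g. decoder params of an autoencoder), if any
        double l1 = layer.calcL1(true);
        double l2 = layer.calcL2(true);

        // Both return 0.0 when the corresponding regularization is not used
        System.out.println("L1 term = " + l1 + ", L2 term = " + l2);
    }
}
```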
org.nd4j.linalg.primitives.Pair<Gradient,org.nd4j.linalg.api.ndarray.INDArray> backpropGradient(org.nd4j.linalg.api.ndarray.INDArray epsilon)
Parameters: epsilon - w^(L+1)*delta^(L+1). Or, equivalently, dC/da, i.e., (dC/dz)*(dz/da) = dC/da, where C is the cost function and a = sigma(z) is the activation.

org.nd4j.linalg.api.ndarray.INDArray preOutput(org.nd4j.linalg.api.ndarray.INDArray x)
Parameters: x - the input to transform

org.nd4j.linalg.api.ndarray.INDArray preOutput(org.nd4j.linalg.api.ndarray.INDArray x, Layer.TrainingMode training)
Parameters: x - the input to transform

org.nd4j.linalg.api.ndarray.INDArray activate(Layer.TrainingMode training)
Parameters: training - training or test mode

org.nd4j.linalg.api.ndarray.INDArray activate(org.nd4j.linalg.api.ndarray.INDArray input, Layer.TrainingMode training)
Parameters:
input - the input to use
training - train or test mode

org.nd4j.linalg.api.ndarray.INDArray preOutput(org.nd4j.linalg.api.ndarray.INDArray x, boolean training)
Parameters: x - the input to transform

org.nd4j.linalg.api.ndarray.INDArray activate(boolean training)
Parameters: training - training or test mode

org.nd4j.linalg.api.ndarray.INDArray activate(org.nd4j.linalg.api.ndarray.INDArray input, boolean training)
Parameters:
input - the input to use
training - train or test mode

org.nd4j.linalg.api.ndarray.INDArray activate()
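A hedged sketch of one forward and backward pass through a single layer, using activate, preOutput and backpropGradient as documented above; the layer is assumed to belong to an initialized network, and the input and epsilon arrays are assumed to be supplied by the caller:

```java
import java.util.Arrays;

import org.deeplearning4j.nn.api.Layer;
import org.deeplearning4j.nn.gradient.Gradient;
import org.nd4j.linalg.api.ndarray.INDArray;
import org.nd4j.linalg.primitives.Pair;

public class ForwardBackwardSketch {
    // layer, input and epsilon are assumed to be supplied by the caller;
    // epsilon is dC/da for this layer's output, as described above
    public static void run(Layer layer, INDArray input, INDArray epsilon) {
        // Forward pass in training mode; this also sets the input on the layer
        INDArray activations = layer.activate(input, true);

        // Raw pre-activation values (e.g. z = Wx + b for a dense layer)
        INDArray z = layer.preOutput(input, true);

        // Backward pass: returns this layer's parameter gradients plus the
        // epsilon to pass on to the layer below
        Pair<Gradient, INDArray> result = layer.backpropGradient(epsilon);
        Gradient paramGradients = result.getFirst();
        INDArray epsilonBelow = result.getSecond();

        System.out.println("Activations shape: " + Arrays.toString(activations.shape())
                + ", epsilon-below shape: " + Arrays.toString(epsilonBelow.shape()));
    }
}
```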
org.nd4j.linalg.api.ndarray.INDArray activate(org.nd4j.linalg.api.ndarray.INDArray input)
Parameters: input - the input to use

@Deprecated
Layer transpose()

Layer clone()

Collection<IterationListener> getListeners()

void setListeners(IterationListener... listeners)
Specified by: setListeners in interface Model

void setListeners(Collection<IterationListener> listeners)
Specified by: setListeners in interface Model
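A hedged example of the two setListeners overloads and getListeners; ScoreIterationListener is used here as one IterationListener implementation shipped with DL4J, which is an assumption about the release in use:

```java
import java.util.Arrays;
import java.util.Collection;

import org.deeplearning4j.nn.api.Layer;
import org.deeplearning4j.optimize.api.IterationListener;
import org.deeplearning4j.optimize.listeners.ScoreIterationListener;

public class ListenerSetup {
    public static void attach(Layer layer) {
        // Varargs overload: print the score every 10 iterations
        layer.setListeners(new ScoreIterationListener(10));

        // Collection overload, equivalent to the call above
        Collection<IterationListener> listeners =
                Arrays.asList((IterationListener) new ScoreIterationListener(10));
        layer.setListeners(listeners);

        // Read back whatever is currently registered
        System.out.println("Registered listeners: " + layer.getListeners().size());
    }
}
```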
void setIndex(int index)

int getIndex()

int getIterationCount()

int getEpochCount()

void setIterationCount(int iterationCount)

void setEpochCount(int epochCount)

void setInput(org.nd4j.linalg.api.ndarray.INDArray input)

void migrateInput()

void setInputMiniBatchSize(int size)

int getInputMiniBatchSize()
See also: setInputMiniBatchSize(int)
void setMaskArray(org.nd4j.linalg.api.ndarray.INDArray maskArray)
feedForwardMaskArray(INDArray, MaskState, int) should be used in preference to this.
Parameters: maskArray - Mask array to set

org.nd4j.linalg.api.ndarray.INDArray getMaskArray()

boolean isPretrainLayer()

void clearNoiseWeightParams()
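A hedged sketch that reports which layers of a network consider themselves pretrainable via isPretrainLayer(); the MultiLayerNetwork.getLayers() accessor used here is outside this interface and is an assumption:

```java
import org.deeplearning4j.nn.api.Layer;
import org.deeplearning4j.nn.multilayer.MultiLayerNetwork;

public class PretrainCheck {
    public static void report(MultiLayerNetwork net) {
        Layer[] layers = net.getLayers();
        for (int i = 0; i < layers.length; i++) {
            // true only for layers that support unsupervised/pretrain training,
            // such as autoencoders or variational autoencoders
            System.out.println("Layer " + i + " (" + layers[i].type() + ")"
                    + ": pretrainable = " + layers[i].isPretrainLayer());
        }
    }
}
```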
org.nd4j.linalg.primitives.Pair<org.nd4j.linalg.api.ndarray.INDArray,MaskState> feedForwardMaskArray(org.nd4j.linalg.api.ndarray.INDArray maskArray, MaskState currentMaskState, int minibatchSize)
Parameters:
maskArray - Mask array to set
currentMaskState - Current state of the mask; see MaskState
minibatchSize - Current minibatch size. Needs to be known as it cannot always be inferred from the activations array due to reshaping (such as a DenseLayer within a recurrent neural network)
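A hedged example of propagating a per-timestep mask for variable-length sequences through a layer. The [minibatch, timeSeriesLength] mask shape and the MaskState.Active value reflect common DL4J usage and are assumptions, not requirements stated on this page:

```java
import org.deeplearning4j.nn.api.Layer;
import org.deeplearning4j.nn.api.MaskState;
import org.nd4j.linalg.api.ndarray.INDArray;
import org.nd4j.linalg.factory.Nd4j;
import org.nd4j.linalg.primitives.Pair;

public class MaskPropagation {
    public static void propagate(Layer layer) {
        int minibatch = 4;
        int timeSteps = 10;

        // Per-timestep mask: 1 = data present at this step, 0 = padding
        INDArray mask = Nd4j.ones(minibatch, timeSteps);
        mask.putScalar(new int[]{0, 9}, 0.0);   // e.g. last step of example 0 is padding

        Pair<INDArray, MaskState> out =
                layer.feedForwardMaskArray(mask, MaskState.Active, minibatch);

        INDArray maskForNextLayer = out.getFirst();
        MaskState nextState = out.getSecond();
        System.out.println("Mask state for the next layer: " + nextState);
    }
}
```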