public class BidirectionalLayer extends Object implements RecurrentLayer
Bidirectional.Mode
See the Bidirectional.Mode javadoc for more details. Usage: .layer(new Bidirectional(new LSTM.Builder()....build()))
Layer.TrainingMode, Layer.Type
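To make the wrapper's behavior concrete, here is a minimal plain-Java sketch of what a CONCAT-style bidirectional combination computes: one copy of the wrapped layer runs forward over the sequence, another over the time-reversed sequence, and the two activation sequences are joined per time step. This is an illustration only, not the DL4J implementation; plain arrays stand in for INDArrays and all names are hypothetical.

```java
import java.util.Arrays;

public class BidiConcatSketch {
    // out[t] = concat(fwd[t], bwd[T-1-t]): the backward pass ran over the
    // reversed sequence, so its output is re-reversed to align both halves
    // on the original time axis before concatenating.
    static double[][] concat(double[][] fwd, double[][] bwd) {
        int T = fwd.length, n = fwd[0].length;
        double[][] out = new double[T][2 * n];
        for (int t = 0; t < T; t++) {
            System.arraycopy(fwd[t], 0, out[t], 0, n);
            System.arraycopy(bwd[T - 1 - t], 0, out[t], n, n);
        }
        return out;
    }

    public static void main(String[] args) {
        double[][] fwd = {{1, 2}, {3, 4}};  // forward activations, T = 2
        double[][] bwd = {{5, 6}, {7, 8}};  // backward activations (reversed time)
        System.out.println(Arrays.deepToString(concat(fwd, bwd)));
    }
}
```

With CONCAT the output feature dimension is twice that of the wrapped layer; other Bidirectional.Mode options combine the two halves differently.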
Constructor and Description |
---|
BidirectionalLayer(NeuralNetConfiguration conf,
RecurrentLayer fwd,
RecurrentLayer bwd) |
Modifier and Type | Method and Description |
---|---|
void |
accumulateScore(double accum)
Sets a rolling tally for the score.
|
org.nd4j.linalg.api.ndarray.INDArray |
activate()
Trigger an activation with the last specified input
|
org.nd4j.linalg.api.ndarray.INDArray |
activate(boolean training)
Trigger an activation with the last specified input
|
org.nd4j.linalg.api.ndarray.INDArray |
activate(org.nd4j.linalg.api.ndarray.INDArray input)
Initialize the layer with the given input
and return the activation for this layer
given this input
|
org.nd4j.linalg.api.ndarray.INDArray |
activate(org.nd4j.linalg.api.ndarray.INDArray input,
boolean training)
Initialize the layer with the given input
and return the activation for this layer
given this input
|
org.nd4j.linalg.api.ndarray.INDArray |
activate(org.nd4j.linalg.api.ndarray.INDArray input,
Layer.TrainingMode training)
Initialize the layer with the given input
and return the activation for this layer
given this input
|
org.nd4j.linalg.api.ndarray.INDArray |
activate(Layer.TrainingMode training)
Trigger an activation with the last specified input
|
void |
addListeners(IterationListener... listener)
This method ADDS additional IterationListener to existing listeners
|
void |
applyConstraints(int iteration,
int epoch)
Apply any constraints to the model
|
org.nd4j.linalg.primitives.Pair<Gradient,org.nd4j.linalg.api.ndarray.INDArray> |
backpropGradient(org.nd4j.linalg.api.ndarray.INDArray epsilon)
Calculate the gradient relative to the error in the next layer
|
int |
batchSize()
The current inputs batch size
|
double |
calcL1(boolean backpropOnlyParams)
Calculate the l1 regularization term
0.0 if regularization is not used. |
double |
calcL2(boolean backpropOnlyParams)
Calculate the l2 regularization term
0.0 if regularization is not used. |
void |
clear()
Clear input
|
void |
clearNoiseWeightParams() |
Layer |
clone()
Clone the layer
|
void |
computeGradientAndScore()
Update the score
|
NeuralNetConfiguration |
conf()
The configuration for the neural network
|
org.nd4j.linalg.primitives.Pair<org.nd4j.linalg.api.ndarray.INDArray,MaskState> |
feedForwardMaskArray(org.nd4j.linalg.api.ndarray.INDArray maskArray,
MaskState currentMaskState,
int minibatchSize)
Feed forward the input mask array, setting it in the layer as appropriate.
|
void |
fit()
All models have a fit method
|
void |
fit(org.nd4j.linalg.api.ndarray.INDArray data)
Fit the model to the given data
|
int |
getEpochCount() |
org.nd4j.linalg.api.ndarray.INDArray |
getGradientsViewArray() |
int |
getIndex()
Get the layer index.
|
int |
getInputMiniBatchSize()
Get current/last input mini-batch size, as set by setInputMiniBatchSize(int)
|
int |
getIterationCount() |
Collection<IterationListener> |
getListeners()
Get the iteration listeners for this layer.
|
org.nd4j.linalg.api.ndarray.INDArray |
getMaskArray() |
ConvexOptimizer |
getOptimizer()
Returns this model's optimizer
|
org.nd4j.linalg.api.ndarray.INDArray |
getParam(String param)
Get the parameter
|
Gradient |
gradient()
Get the gradient.
|
org.nd4j.linalg.primitives.Pair<Gradient,Double> |
gradientAndScore()
Get the gradient and score
|
void |
init()
Init the model
|
void |
initParams()
Initialize the parameters
|
org.nd4j.linalg.api.ndarray.INDArray |
input()
The input/feature matrix for the model
|
boolean |
isPretrainLayer()
Returns true if the layer can be trained in an unsupervised/pretrain manner (AE, VAE, etc)
|
void |
iterate(org.nd4j.linalg.api.ndarray.INDArray input)
Run one iteration
|
void |
migrateInput()
For use with ND4J workspaces.
|
int |
numParams()
the number of parameters for the model
|
int |
numParams(boolean backwards)
the number of parameters for the model
|
org.nd4j.linalg.api.ndarray.INDArray |
params()
Parameters of the model (if any)
|
Map<String,org.nd4j.linalg.api.ndarray.INDArray> |
paramTable()
The param table
|
Map<String,org.nd4j.linalg.api.ndarray.INDArray> |
paramTable(boolean backpropParamsOnly)
Table of parameters by key, for backprop
For many models (dense layers, etc) - all parameters are backprop parameters
|
org.nd4j.linalg.api.ndarray.INDArray |
preOutput(org.nd4j.linalg.api.ndarray.INDArray x)
Raw activations
|
org.nd4j.linalg.api.ndarray.INDArray |
preOutput(org.nd4j.linalg.api.ndarray.INDArray x,
boolean training)
Raw activations
|
org.nd4j.linalg.api.ndarray.INDArray |
preOutput(org.nd4j.linalg.api.ndarray.INDArray x,
Layer.TrainingMode training)
Raw activations
|
org.nd4j.linalg.api.ndarray.INDArray |
rnnActivateUsingStoredState(org.nd4j.linalg.api.ndarray.INDArray input,
boolean training,
boolean storeLastForTBPTT)
Similar to rnnTimeStep, this method is used for activations using the state
stored in the stateMap as the initialization.
|
void |
rnnClearPreviousState()
Reset/clear the stateMap for rnnTimeStep() and tBpttStateMap for rnnActivateUsingStoredState()
|
Map<String,org.nd4j.linalg.api.ndarray.INDArray> |
rnnGetPreviousState()
Returns a shallow copy of the RNN stateMap (that contains the stored history for use in methods such
as rnnTimeStep)
|
Map<String,org.nd4j.linalg.api.ndarray.INDArray> |
rnnGetTBPTTState()
Get the RNN truncated backpropagation through time (TBPTT) state for the recurrent layer.
|
void |
rnnSetPreviousState(Map<String,org.nd4j.linalg.api.ndarray.INDArray> stateMap)
Set the stateMap (stored history).
|
void |
rnnSetTBPTTState(Map<String,org.nd4j.linalg.api.ndarray.INDArray> state)
Set the RNN truncated backpropagation through time (TBPTT) state for the recurrent layer.
|
org.nd4j.linalg.api.ndarray.INDArray |
rnnTimeStep(org.nd4j.linalg.api.ndarray.INDArray input)
Do one or more time steps using the previous time step state stored in stateMap.
Can be used to efficiently do the forward pass one or n time steps at a time (instead of always doing the forward pass from t=0). If stateMap is empty, default initialization (usually zeros) is used. Implementations also update stateMap at the end of this method. |
double |
score()
The score for the model
|
void |
setBackpropGradientsViewArray(org.nd4j.linalg.api.ndarray.INDArray gradients)
Set the gradients array as a view of the full (backprop) network parameters
NOTE: this is intended to be used internally in MultiLayerNetwork and ComputationGraph, not by users.
|
void |
setCacheMode(CacheMode mode)
This method sets given CacheMode for current layer
|
void |
setConf(NeuralNetConfiguration conf)
Setter for the configuration
|
void |
setEpochCount(int epochCount)
Set the current epoch count (number of epochs passed) for the layer/network
|
void |
setIndex(int index)
Set the layer index.
|
void |
setInput(org.nd4j.linalg.api.ndarray.INDArray input)
Set the layer input.
|
void |
setInputMiniBatchSize(int size)
Set current/last input mini-batch size.
Used for score and gradient calculations. |
void |
setIterationCount(int iterationCount)
Set the current iteration count (number of parameter updates) for the layer/network
|
void |
setListeners(Collection<IterationListener> listeners)
Set the iteration listeners for this layer.
|
void |
setListeners(IterationListener... listeners)
Set the iteration listeners for this layer.
|
void |
setMaskArray(org.nd4j.linalg.api.ndarray.INDArray maskArray)
Set the mask array.
|
void |
setParam(String key,
org.nd4j.linalg.api.ndarray.INDArray val)
Set the parameter with a new ndarray
|
void |
setParams(org.nd4j.linalg.api.ndarray.INDArray params)
Set the parameters for this model.
|
void |
setParamsViewArray(org.nd4j.linalg.api.ndarray.INDArray params)
Set the initial parameters array as a view of the full (backprop) network parameters
NOTE: this is intended to be used internally in MultiLayerNetwork and ComputationGraph, not by users.
|
void |
setParamTable(Map<String,org.nd4j.linalg.api.ndarray.INDArray> paramTable)
Setter for the param table
|
org.nd4j.linalg.primitives.Pair<Gradient,org.nd4j.linalg.api.ndarray.INDArray> |
tbpttBackpropGradient(org.nd4j.linalg.api.ndarray.INDArray epsilon,
int tbpttBackLength)
Truncated BPTT equivalent of Layer.backpropGradient().
|
Layer |
transpose()
Return a transposed copy of the weights/bias
(this means reverse the number of inputs and outputs on the weights)
|
Layer.Type |
type()
Returns the layer type
|
void |
update(Gradient gradient)
Update layer weights and biases with gradient change
|
void |
update(org.nd4j.linalg.api.ndarray.INDArray gradient,
String paramType)
Perform one update applying the gradient
|
void |
validateInput()
Validate the input
|
public BidirectionalLayer(@NonNull NeuralNetConfiguration conf, @NonNull RecurrentLayer fwd, @NonNull RecurrentLayer bwd)
public org.nd4j.linalg.api.ndarray.INDArray rnnTimeStep(org.nd4j.linalg.api.ndarray.INDArray input)
RecurrentLayer
rnnTimeStep
in interface RecurrentLayer
input
- Input to this layer
public Map<String,org.nd4j.linalg.api.ndarray.INDArray> rnnGetPreviousState()
RecurrentLayer
rnnGetPreviousState
in interface RecurrentLayer
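The rnnTimeStep contract described above (empty stateMap means default zero initialization; implementations update the stateMap at the end of each call) can be sketched with a toy stateful cell. This is a hypothetical illustration in plain Java, not the DL4J LSTM logic; the "cell" is just a running sum.

```java
import java.util.HashMap;
import java.util.Map;

public class TimeStepSketch {
    private final Map<String, double[]> stateMap = new HashMap<>();

    // One time step: start from the stored state (zeros if the map is empty),
    // combine it with the input, then store the new state for the next call.
    double[] rnnTimeStep(double[] input) {
        double[] h = stateMap.getOrDefault("h", new double[input.length]);
        double[] next = new double[input.length];
        for (int i = 0; i < input.length; i++) next[i] = h[i] + input[i];
        stateMap.put("h", next);  // implementations update stateMap at the end
        return next;
    }

    // Mirrors rnnClearPreviousState(): reset so the next step starts from zeros.
    void rnnClearPreviousState() { stateMap.clear(); }

    public static void main(String[] args) {
        TimeStepSketch layer = new TimeStepSketch();
        System.out.println(layer.rnnTimeStep(new double[]{1, 2})[0]); // first step from zeros
        System.out.println(layer.rnnTimeStep(new double[]{1, 2})[0]); // state carried over
    }
}
```

Because state is carried between calls, the second step sees the first step's output rather than restarting from t=0.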
public void rnnSetPreviousState(Map<String,org.nd4j.linalg.api.ndarray.INDArray> stateMap)
RecurrentLayer
rnnSetPreviousState
in interface RecurrentLayer
public void rnnClearPreviousState()
RecurrentLayer
rnnClearPreviousState
in interface RecurrentLayer
public org.nd4j.linalg.api.ndarray.INDArray rnnActivateUsingStoredState(org.nd4j.linalg.api.ndarray.INDArray input, boolean training, boolean storeLastForTBPTT)
RecurrentLayer
rnnActivateUsingStoredState
in interface RecurrentLayer
input
- Layer input
training
- if true: training. Otherwise: test
storeLastForTBPTT
- If true: store the final state in tBpttStateMap for use in truncated BPTT training
public Map<String,org.nd4j.linalg.api.ndarray.INDArray> rnnGetTBPTTState()
RecurrentLayer
rnnGetTBPTTState
in interface RecurrentLayer
public void rnnSetTBPTTState(Map<String,org.nd4j.linalg.api.ndarray.INDArray> state)
RecurrentLayer
rnnSetTBPTTState
in interface RecurrentLayer
state
- TBPTT state to set
public org.nd4j.linalg.primitives.Pair<Gradient,org.nd4j.linalg.api.ndarray.INDArray> tbpttBackpropGradient(org.nd4j.linalg.api.ndarray.INDArray epsilon, int tbpttBackLength)
RecurrentLayer
tbpttBackpropGradient
in interface RecurrentLayer
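Truncated BPTT, as used by tbpttBackpropGradient, limits how far back in time gradients flow by splitting a long sequence into fixed-length segments. The following is a hypothetical helper (not DL4J API) that just shows the segmentation arithmetic for a sequence of length seqLength with truncation length tbpttLength.

```java
import java.util.Arrays;

public class TbpttSketch {
    // Returns [start, end) index pairs covering the sequence; gradients are
    // only propagated within each segment, never across segment boundaries.
    static int[][] segments(int seqLength, int tbpttLength) {
        int n = (seqLength + tbpttLength - 1) / tbpttLength;  // ceil division
        int[][] out = new int[n][2];
        for (int i = 0; i < n; i++) {
            out[i][0] = i * tbpttLength;
            out[i][1] = Math.min(seqLength, (i + 1) * tbpttLength);
        }
        return out;
    }

    public static void main(String[] args) {
        // A length-10 sequence with tbpttLength=4 yields three segments,
        // the last one shorter than the truncation length.
        System.out.println(Arrays.deepToString(segments(10, 4)));
    }
}
```

The forward-pass state is still carried across segments (via the tBpttStateMap); only the backward pass is truncated.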
public void setCacheMode(CacheMode mode)
Layer
setCacheMode
in interface Layer
public double calcL2(boolean backpropOnlyParams)
Layer
public double calcL1(boolean backpropOnlyParams)
Layer
public Layer.Type type()
Layer
public org.nd4j.linalg.primitives.Pair<Gradient,org.nd4j.linalg.api.ndarray.INDArray> backpropGradient(org.nd4j.linalg.api.ndarray.INDArray epsilon)
Layer
backpropGradient
in interface Layer
epsilon
- w^(L+1)*delta^(L+1). Or, equivalently: dC/da, i.e., (dC/dz)*(dz/da) = dC/da, where C is the cost function and a = sigma(z) is the activation.
public org.nd4j.linalg.api.ndarray.INDArray preOutput(org.nd4j.linalg.api.ndarray.INDArray x)
Layer
public org.nd4j.linalg.api.ndarray.INDArray preOutput(org.nd4j.linalg.api.ndarray.INDArray x, Layer.TrainingMode training)
Layer
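The chain-rule relationship in the epsilon parameter description above can be sanity-checked numerically. The sketch below (plain Java, no DL4J types; the cost function choice is an assumption for illustration) uses a squared-error cost C = 0.5*(a - y)^2 with a = sigma(z), for which dC/dz = (a - y)*a*(1 - a), and compares that against a central finite difference.

```java
public class EpsilonCheck {
    static double sigma(double z) { return 1.0 / (1.0 + Math.exp(-z)); }

    static double cost(double z, double y) {
        double a = sigma(z);
        return 0.5 * (a - y) * (a - y);
    }

    // Compare the analytic chain-rule gradient dC/dz = (dC/da)*(da/dz)
    // against a central-difference estimate of the same derivative.
    static boolean matches(double z, double y) {
        double a = sigma(z);
        double analytic = (a - y) * a * (1 - a);
        double h = 1e-6;
        double numeric = (cost(z + h, y) - cost(z - h, y)) / (2 * h);
        return Math.abs(analytic - numeric) < 1e-8;
    }

    public static void main(String[] args) {
        System.out.println(matches(0.3, 1.0));
    }
}
```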
public org.nd4j.linalg.api.ndarray.INDArray activate(Layer.TrainingMode training)
Layer
public org.nd4j.linalg.api.ndarray.INDArray activate(org.nd4j.linalg.api.ndarray.INDArray input, Layer.TrainingMode training)
Layer
public org.nd4j.linalg.api.ndarray.INDArray preOutput(org.nd4j.linalg.api.ndarray.INDArray x, boolean training)
Layer
public org.nd4j.linalg.api.ndarray.INDArray activate(boolean training)
Layer
public org.nd4j.linalg.api.ndarray.INDArray activate(org.nd4j.linalg.api.ndarray.INDArray input, boolean training)
Layer
public org.nd4j.linalg.api.ndarray.INDArray activate()
Layer
public org.nd4j.linalg.api.ndarray.INDArray activate(org.nd4j.linalg.api.ndarray.INDArray input)
Layer
public Layer transpose()
Layer
public Collection<IterationListener> getListeners()
Layer
getListeners
in interface Layer
public void setListeners(IterationListener... listeners)
Layer
setListeners
in interface Layer
setListeners
in interface Model
public void addListeners(IterationListener... listener)
Model
addListeners
in interface Model
public void fit()
Model
public void update(Gradient gradient)
Model
public void update(org.nd4j.linalg.api.ndarray.INDArray gradient, String paramType)
Model
public double score()
Model
public void computeGradientAndScore()
Model
computeGradientAndScore
in interface Model
public void accumulateScore(double accum)
Model
accumulateScore
in interface Model
accum
- the amount to accumulate
public org.nd4j.linalg.api.ndarray.INDArray params()
Model
public int numParams()
Model
public int numParams(boolean backwards)
Model
public void setParams(org.nd4j.linalg.api.ndarray.INDArray params)
Model
public void setParamsViewArray(org.nd4j.linalg.api.ndarray.INDArray params)
Model
setParamsViewArray
in interface Model
params
- a 1 x nParams row vector that is a view of the larger (MLN/CG) parameters array
public org.nd4j.linalg.api.ndarray.INDArray getGradientsViewArray()
getGradientsViewArray
in interface Model
public void setBackpropGradientsViewArray(org.nd4j.linalg.api.ndarray.INDArray gradients)
Model
setBackpropGradientsViewArray
in interface Model
gradients
- a 1 x nParams row vector that is a view of the larger (MLN/CG) gradients array
public void fit(org.nd4j.linalg.api.ndarray.INDArray data)
Model
public void iterate(org.nd4j.linalg.api.ndarray.INDArray input)
Model
public Gradient gradient()
Model
Model.computeGradientAndScore()
public org.nd4j.linalg.primitives.Pair<Gradient,Double> gradientAndScore()
Model
gradientAndScore
in interface Model
public int batchSize()
Model
public NeuralNetConfiguration conf()
Model
public void setConf(NeuralNetConfiguration conf)
Model
public org.nd4j.linalg.api.ndarray.INDArray input()
Model
public void validateInput()
Model
validateInput
in interface Model
public ConvexOptimizer getOptimizer()
Model
getOptimizer
in interface Model
public org.nd4j.linalg.api.ndarray.INDArray getParam(String param)
Model
public void initParams()
Model
initParams
in interface Model
public Map<String,org.nd4j.linalg.api.ndarray.INDArray> paramTable()
Model
paramTable
in interface Model
public Map<String,org.nd4j.linalg.api.ndarray.INDArray> paramTable(boolean backpropParamsOnly)
Model
paramTable
in interface Model
backpropParamsOnly
- If true, return backprop params only. If false: return all params (equivalent to paramTable())
public void setParamTable(Map<String,org.nd4j.linalg.api.ndarray.INDArray> paramTable)
Model
setParamTable
in interface Model
public void setParam(String key, org.nd4j.linalg.api.ndarray.INDArray val)
Model
public void clear()
Model
public void applyConstraints(int iteration, int epoch)
Model
applyConstraints
in interface Model
public void init()
Model
public void setListeners(Collection<IterationListener> listeners)
Layer
setListeners
in interface Layer
setListeners
in interface Model
public void setIndex(int index)
Layer
public int getIndex()
Layer
public int getIterationCount()
getIterationCount
in interface Layer
public int getEpochCount()
getEpochCount
in interface Layer
public void setIterationCount(int iterationCount)
Layer
setIterationCount
in interface Layer
public void setEpochCount(int epochCount)
Layer
setEpochCount
in interface Layer
public void setInput(org.nd4j.linalg.api.ndarray.INDArray input)
Layer
public void migrateInput()
Layer
migrateInput
in interface Layer
public void setInputMiniBatchSize(int size)
Layer
setInputMiniBatchSize
in interface Layer
public int getInputMiniBatchSize()
Layer
getInputMiniBatchSize
in interface Layer
Layer.setInputMiniBatchSize(int)
public void setMaskArray(org.nd4j.linalg.api.ndarray.INDArray maskArray)
Layer
Layer.feedForwardMaskArray(INDArray, MaskState, int) should be used in preference to this.
setMaskArray
in interface Layer
maskArray
- Mask array to set
public org.nd4j.linalg.api.ndarray.INDArray getMaskArray()
getMaskArray
in interface Layer
public boolean isPretrainLayer()
Layer
isPretrainLayer
in interface Layer
public void clearNoiseWeightParams()
clearNoiseWeightParams
in interface Layer
public org.nd4j.linalg.primitives.Pair<org.nd4j.linalg.api.ndarray.INDArray,MaskState> feedForwardMaskArray(org.nd4j.linalg.api.ndarray.INDArray maskArray, MaskState currentMaskState, int minibatchSize)
Layer
feedForwardMaskArray
in interface Layer
maskArray
- Mask array to set
currentMaskState
- Current state of the mask - see MaskState
minibatchSize
- Current minibatch size. Needs to be known as it cannot always be inferred from the activations array due to reshaping (such as a DenseLayer within a recurrent neural network).
Copyright © 2018. All rights reserved.
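The per-timestep masking that feedForwardMaskArray propagates can be illustrated with a minimal plain-Java sketch: a 0/1 mask over time steps zeroes out activations at padded positions so they do not contribute downstream. This is an illustration of the masking idea only, not the DL4J implementation; plain arrays stand in for INDArrays.

```java
import java.util.Arrays;

public class MaskSketch {
    // activations: [timeSteps][features] for one example; mask: [timeSteps]
    // with 1.0 for real steps and 0.0 for padding.
    static double[][] applyMask(double[][] activations, double[] mask) {
        int T = activations.length, n = activations[0].length;
        double[][] out = new double[T][n];
        for (int t = 0; t < T; t++)
            for (int i = 0; i < n; i++)
                out[t][i] = activations[t][i] * mask[t];  // zero padded steps
        return out;
    }

    public static void main(String[] args) {
        double[][] act = {{1, 1}, {2, 2}, {3, 3}};
        double[] mask = {1, 1, 0};  // last time step is padding
        System.out.println(Arrays.deepToString(applyMask(act, mask)));
    }
}
```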