Package org.nd4j.linalg.factory.ops
Class NDRNN
- java.lang.Object
  - org.nd4j.linalg.factory.ops.NDRNN
public class NDRNN extends Object
-
-
Constructor Summary
NDRNN()
-
Method Summary
INDArray gru(INDArray x, INDArray hLast, INDArray Wx, INDArray Wh, INDArray biases)
  The GRU operation.
INDArray[] gruCell(INDArray x, INDArray hLast, GRUWeights GRUWeights)
  The GRU cell.
INDArray lstmblock(INDArray maxTSLength, INDArray x, INDArray cLast, INDArray yLast, LSTMWeights LSTMWeights, LSTMConfiguration LSTMConfiguration)
  The LSTM block.
INDArray lstmblock(INDArray x, LSTMWeights LSTMWeights, LSTMConfiguration LSTMConfiguration)
  The LSTM block.
INDArray[] lstmCell(INDArray x, INDArray cLast, INDArray yLast, LSTMWeights LSTMWeights, LSTMConfiguration LSTMConfiguration)
  The LSTM cell.
INDArray[] lstmLayer(INDArray x, INDArray cLast, INDArray yLast, INDArray maxTSLength, LSTMLayerWeights LSTMLayerWeights, LSTMLayerConfig LSTMLayerConfig)
  Long Short-Term Memory layer - Hochreiter 1997.
INDArray[] lstmLayer(INDArray x, LSTMLayerWeights LSTMLayerWeights, LSTMLayerConfig LSTMLayerConfig)
  Long Short-Term Memory layer - Hochreiter 1997.
INDArray sru(INDArray x, INDArray initialC, INDArray mask, SRUWeights SRUWeights)
  The SRU layer.
INDArray sru(INDArray x, INDArray initialC, SRUWeights SRUWeights)
  The SRU layer.
INDArray sruCell(INDArray x, INDArray cLast, SRUWeights SRUWeights)
  The SRU cell.
-
-
-
Method Detail
-
gru
public INDArray gru(INDArray x, INDArray hLast, INDArray Wx, INDArray Wh, INDArray biases)
The GRU operation. Gated Recurrent Unit - Cho et al. 2014.
Parameters:
x - input [time, bS, nIn] (NUMERIC type)
hLast - initial cell output (at time step = 0) [bS, nOut] (NUMERIC type)
Wx - input-to-hidden weights, [nIn, 3*nOut] (NUMERIC type)
Wh - hidden-to-hidden weights, [nOut, 3*nOut] (NUMERIC type)
biases - biases, [3*nOut] (NUMERIC type)
Returns:
h - cell outputs [time, bS, nOut], one per time step (NUMERIC type)
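As a minimal sketch of calling gru, assuming an ND4J backend (e.g. nd4j-native) is on the classpath; the dimensions below are hypothetical and only chosen to match the documented shapes:

```java
import java.util.Arrays;

import org.nd4j.linalg.api.buffer.DataType;
import org.nd4j.linalg.api.ndarray.INDArray;
import org.nd4j.linalg.factory.Nd4j;
import org.nd4j.linalg.factory.ops.NDRNN;

public class GruExample {
    public static void main(String[] args) {
        int time = 5, bS = 2, nIn = 3, nOut = 4;  // hypothetical sizes

        INDArray x      = Nd4j.rand(DataType.FLOAT, time, bS, nIn);  // input sequence [time, bS, nIn]
        INDArray hLast  = Nd4j.zeros(DataType.FLOAT, bS, nOut);      // initial cell output [bS, nOut]
        INDArray Wx     = Nd4j.rand(DataType.FLOAT, nIn, 3 * nOut);  // input-to-hidden weights
        INDArray Wh     = Nd4j.rand(DataType.FLOAT, nOut, 3 * nOut); // hidden-to-hidden weights
        INDArray biases = Nd4j.zeros(DataType.FLOAT, 3 * nOut);      // gate biases [3*nOut]

        INDArray h = new NDRNN().gru(x, hLast, Wx, Wh, biases);      // outputs per time step
        System.out.println(Arrays.toString(h.shape()));              // [time, bS, nOut]
    }
}
```

The same factory instance is also available as the static `Nd4j.rnn` field in recent ND4J versions.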
-
gruCell
public INDArray[] gruCell(INDArray x, INDArray hLast, GRUWeights GRUWeights)
The GRU cell. Performs a single time step operation.
Parameters:
x - Input, with shape [batchSize, inSize] (NUMERIC type)
hLast - Output of the previous cell/time step, with shape [batchSize, numUnits] (NUMERIC type)
GRUWeights - Configuration Object
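A sketch of a single gruCell step. The GRUWeights builder fields (`ruWeight`, `cWeight`, `ruBias`, `cBias`) and the TF-style weight shapes (reset/update gates share one [inSize+numUnits, 2*numUnits] matrix, the candidate uses [inSize+numUnits, numUnits]) are assumptions to verify against your ND4J version:

```java
import java.util.Arrays;

import org.nd4j.linalg.api.buffer.DataType;
import org.nd4j.linalg.api.ndarray.INDArray;
import org.nd4j.linalg.api.ops.impl.layers.recurrent.weights.GRUWeights;
import org.nd4j.linalg.factory.Nd4j;
import org.nd4j.linalg.factory.ops.NDRNN;

public class GruCellExample {
    public static void main(String[] args) {
        int bS = 2, nIn = 3, nOut = 4;  // hypothetical sizes

        GRUWeights weights = GRUWeights.builder()
                .ruWeight(Nd4j.rand(DataType.FLOAT, nIn + nOut, 2 * nOut)) // reset+update gate weights
                .cWeight(Nd4j.rand(DataType.FLOAT, nIn + nOut, nOut))      // candidate weights
                .ruBias(Nd4j.zeros(DataType.FLOAT, 2 * nOut))              // reset+update gate bias
                .cBias(Nd4j.zeros(DataType.FLOAT, nOut))                   // candidate bias
                .build();

        INDArray x     = Nd4j.rand(DataType.FLOAT, bS, nIn);
        INDArray hLast = Nd4j.zeros(DataType.FLOAT, bS, nOut);

        INDArray[] out = new NDRNN().gruCell(x, hLast, weights);
        // The last array is the cell output h for this step, shape [bS, nOut]
        System.out.println(Arrays.toString(out[out.length - 1].shape()));
    }
}
```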
-
lstmCell
public INDArray[] lstmCell(INDArray x, INDArray cLast, INDArray yLast, LSTMWeights LSTMWeights, LSTMConfiguration LSTMConfiguration)
The LSTM cell. Performs a single time step operation.
Parameters:
x - Input, with shape [batchSize, inSize] (NUMERIC type)
cLast - Previous cell state, with shape [batchSize, numUnits] (NUMERIC type)
yLast - Previous cell output, with shape [batchSize, numUnits] (NUMERIC type)
LSTMWeights - Configuration Object
LSTMConfiguration - Configuration Object
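A hedged sketch of a single lstmCell step. The LSTMWeights builder fields (`weights`, `bias`, `inputPeepholeWeights`, `forgetPeepholeWeights`, `outputPeepholeWeights`), the combined [inSize+numUnits, 4*numUnits] weight layout, and the LSTMConfiguration fields (`peepHole`, `forgetBias`, `clippingCellValue`) are assumptions to verify against your ND4J version:

```java
import org.nd4j.linalg.api.buffer.DataType;
import org.nd4j.linalg.api.ndarray.INDArray;
import org.nd4j.linalg.api.ops.impl.layers.recurrent.config.LSTMConfiguration;
import org.nd4j.linalg.api.ops.impl.layers.recurrent.weights.LSTMWeights;
import org.nd4j.linalg.factory.Nd4j;
import org.nd4j.linalg.factory.ops.NDRNN;

public class LstmCellExample {
    public static void main(String[] args) {
        int bS = 2, nIn = 3, nOut = 4;  // hypothetical sizes

        LSTMWeights weights = LSTMWeights.builder()
                .weights(Nd4j.rand(DataType.FLOAT, nIn + nOut, 4 * nOut)) // combined input+recurrent weights
                .bias(Nd4j.zeros(DataType.FLOAT, 4 * nOut))               // gate biases
                .inputPeepholeWeights(Nd4j.zeros(DataType.FLOAT, nOut))   // peepholes (unused when peepHole=false)
                .forgetPeepholeWeights(Nd4j.zeros(DataType.FLOAT, nOut))
                .outputPeepholeWeights(Nd4j.zeros(DataType.FLOAT, nOut))
                .build();

        LSTMConfiguration config = LSTMConfiguration.builder()
                .peepHole(false)
                .forgetBias(1.0)
                .clippingCellValue(0.0)
                .build();

        INDArray x     = Nd4j.rand(DataType.FLOAT, bS, nIn);
        INDArray cLast = Nd4j.zeros(DataType.FLOAT, bS, nOut);
        INDArray yLast = Nd4j.zeros(DataType.FLOAT, bS, nOut);

        INDArray[] out = new NDRNN().lstmCell(x, cLast, yLast, weights, config);
        System.out.println(out.length + " output arrays");  // gate activations, cell state, output
    }
}
```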
-
lstmLayer
public INDArray[] lstmLayer(INDArray x, INDArray cLast, INDArray yLast, INDArray maxTSLength, LSTMLayerWeights LSTMLayerWeights, LSTMLayerConfig LSTMLayerConfig)
Long Short-Term Memory layer - Hochreiter 1997.
Supports the following data formats:
for unidirectional:
TNS: shape [timeLength, numExamples, inOutSize]
NST: shape [numExamples, inOutSize, timeLength]
NTS: shape [numExamples, timeLength, inOutSize]
for bidirectional:
T2NS: shape [timeLength, 2, numExamples, inOutSize] (for ONNX)
Supports the following direction modes:
FWD: forward
BWD: backward
BIDIR_SUM: bidirectional sum
BIDIR_CONCAT: bidirectional concat
BIDIR_EXTRA_DIM: bidirectional extra output dim (in conjunction with data format T2NS)
You may use different gate configurations: specify the gate/cell/out alpha/beta values and the gate/cell/out activations from the activations enum
("RELU","SIGMOID","AFFINE","LEAKY_RELU","THRESHHOLD_RELU","SCALED_TAHN","HARD_SIGMOID","ELU","SOFTSIGN","SOFTPLUS").
This layer also supports MKL-DNN (DNNL) and cuDNN acceleration.
Parameters:
x - Input, with shape dependent on the data format (in config). (NUMERIC type)
cLast - Previous/initial cell state, with shape [batchSize, numUnits] (NUMERIC type)
yLast - Previous/initial cell output, with shape [batchSize, numUnits] (NUMERIC type)
maxTSLength - maxTSLength with shape [batchSize] (NUMERIC type)
LSTMLayerWeights - Configuration Object
LSTMLayerConfig - Configuration Object
-
lstmLayer
public INDArray[] lstmLayer(INDArray x, LSTMLayerWeights LSTMLayerWeights, LSTMLayerConfig LSTMLayerConfig)
Long Short-Term Memory layer - Hochreiter 1997.
Supports the following data formats:
for unidirectional:
TNS: shape [timeLength, numExamples, inOutSize]
NST: shape [numExamples, inOutSize, timeLength]
NTS: shape [numExamples, timeLength, inOutSize]
for bidirectional:
T2NS: shape [timeLength, 2, numExamples, inOutSize] (for ONNX)
Supports the following direction modes:
FWD: forward
BWD: backward
BIDIR_SUM: bidirectional sum
BIDIR_CONCAT: bidirectional concat
BIDIR_EXTRA_DIM: bidirectional extra output dim (in conjunction with data format T2NS)
You may use different gate configurations: specify the gate/cell/out alpha/beta values and the gate/cell/out activations from the activations enum
("RELU","SIGMOID","AFFINE","LEAKY_RELU","THRESHHOLD_RELU","SCALED_TAHN","HARD_SIGMOID","ELU","SOFTSIGN","SOFTPLUS").
This layer also supports MKL-DNN (DNNL) and cuDNN acceleration.
Parameters:
x - Input, with shape dependent on the data format (in config). (NUMERIC type)
LSTMLayerWeights - Configuration Object
LSTMLayerConfig - Configuration Object
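A hedged sketch of the two-argument lstmLayer overload in NTS format. The LSTMLayerConfig builder fields (`lstmdataformat`, `directionMode`, `gateAct`, `cellAct`, `outAct`, `retFullSequence`), the LSTMLayerWeights fields (`weights`, `rWeights`, `bias`), the enum import paths, and the weight shapes are all assumptions to verify against your ND4J version:

```java
import java.util.Arrays;

import org.nd4j.linalg.api.buffer.DataType;
import org.nd4j.linalg.api.ndarray.INDArray;
import org.nd4j.linalg.api.ops.impl.layers.recurrent.config.LSTMActivations;
import org.nd4j.linalg.api.ops.impl.layers.recurrent.config.LSTMDataFormat;
import org.nd4j.linalg.api.ops.impl.layers.recurrent.config.LSTMDirectionMode;
import org.nd4j.linalg.api.ops.impl.layers.recurrent.config.LSTMLayerConfig;
import org.nd4j.linalg.api.ops.impl.layers.recurrent.weights.LSTMLayerWeights;
import org.nd4j.linalg.factory.Nd4j;
import org.nd4j.linalg.factory.ops.NDRNN;

public class LstmLayerExample {
    public static void main(String[] args) {
        int time = 5, bS = 2, nIn = 3, nOut = 4;  // hypothetical sizes

        LSTMLayerConfig config = LSTMLayerConfig.builder()
                .lstmdataformat(LSTMDataFormat.NTS)    // x as [numExamples, timeLength, inOutSize]
                .directionMode(LSTMDirectionMode.FWD)  // unidirectional, forward
                .gateAct(LSTMActivations.SIGMOID)
                .cellAct(LSTMActivations.TANH)
                .outAct(LSTMActivations.TANH)
                .retFullSequence(true)                 // return outputs for all time steps
                .build();

        LSTMLayerWeights weights = LSTMLayerWeights.builder()
                .weights(Nd4j.rand(DataType.FLOAT, nIn, 4 * nOut))   // input weights
                .rWeights(Nd4j.rand(DataType.FLOAT, nOut, 4 * nOut)) // recurrent weights
                .bias(Nd4j.zeros(DataType.FLOAT, 4 * nOut))          // gate biases
                .build();

        INDArray x = Nd4j.rand(DataType.FLOAT, bS, time, nIn);       // NTS layout
        INDArray[] out = new NDRNN().lstmLayer(x, weights, config);
        System.out.println(Arrays.toString(out[0].shape()));         // full-sequence output
    }
}
```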
-
lstmblock
public INDArray lstmblock(INDArray maxTSLength, INDArray x, INDArray cLast, INDArray yLast, LSTMWeights LSTMWeights, LSTMConfiguration LSTMConfiguration)
The LSTM block.
Parameters:
maxTSLength - (NUMERIC type)
x - Input, with shape dependent on the data format (in config). (NUMERIC type)
cLast - Previous/initial cell state, with shape [batchSize, numUnits] (NUMERIC type)
yLast - Previous/initial cell output, with shape [batchSize, numUnits] (NUMERIC type)
LSTMWeights - Configuration Object
LSTMConfiguration - Configuration Object
Returns:
output - The layer's outputs. (NUMERIC type)
-
lstmblock
public INDArray lstmblock(INDArray x, LSTMWeights LSTMWeights, LSTMConfiguration LSTMConfiguration)
The LSTM block.
Parameters:
x - Input, with shape dependent on the data format (in config). (NUMERIC type)
LSTMWeights - Configuration Object
LSTMConfiguration - Configuration Object
Returns:
output - The layer's outputs. (NUMERIC type)
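A hedged sketch of the two-argument lstmblock overload over a full sequence in TNS format. The LSTMWeights builder fields, the combined [inSize+numUnits, 4*numUnits] weight layout, and the LSTMConfiguration `dataFormat` field (`RnnDataFormat.TNS`) are assumptions to verify against your ND4J version:

```java
import java.util.Arrays;

import org.nd4j.linalg.api.buffer.DataType;
import org.nd4j.linalg.api.ndarray.INDArray;
import org.nd4j.linalg.api.ops.impl.layers.recurrent.config.LSTMConfiguration;
import org.nd4j.linalg.api.ops.impl.layers.recurrent.config.RnnDataFormat;
import org.nd4j.linalg.api.ops.impl.layers.recurrent.weights.LSTMWeights;
import org.nd4j.linalg.factory.Nd4j;
import org.nd4j.linalg.factory.ops.NDRNN;

public class LstmBlockExample {
    public static void main(String[] args) {
        int time = 5, bS = 2, nIn = 3, nOut = 4;  // hypothetical sizes

        LSTMWeights weights = LSTMWeights.builder()
                .weights(Nd4j.rand(DataType.FLOAT, nIn + nOut, 4 * nOut)) // combined input+recurrent weights
                .bias(Nd4j.zeros(DataType.FLOAT, 4 * nOut))
                .inputPeepholeWeights(Nd4j.zeros(DataType.FLOAT, nOut))
                .forgetPeepholeWeights(Nd4j.zeros(DataType.FLOAT, nOut))
                .outputPeepholeWeights(Nd4j.zeros(DataType.FLOAT, nOut))
                .build();

        LSTMConfiguration config = LSTMConfiguration.builder()
                .dataFormat(RnnDataFormat.TNS)  // x as [timeLength, batchSize, inSize]
                .peepHole(false)
                .forgetBias(1.0)
                .clippingCellValue(0.0)
                .build();

        INDArray x = Nd4j.rand(DataType.FLOAT, time, bS, nIn);
        INDArray out = new NDRNN().lstmblock(x, weights, config);
        System.out.println(Arrays.toString(out.shape()));
    }
}
```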
-
sru
public INDArray sru(INDArray x, INDArray initialC, INDArray mask, SRUWeights SRUWeights)
The SRU layer. Does a single time step operation.
Parameters:
x - Input, with shape [batchSize, inSize] (NUMERIC type)
initialC - Initial cell state, with shape [batchSize, inSize] (NUMERIC type)
mask - An optional dropout mask, with shape [batchSize, inSize] (NUMERIC type)
SRUWeights - Configuration Object
Returns:
output - The cell's outputs. (NUMERIC type)
-
sru
public INDArray sru(INDArray x, INDArray initialC, SRUWeights SRUWeights)
The SRU layer. Does a single time step operation.
Parameters:
x - Input, with shape [batchSize, inSize] (NUMERIC type)
initialC - Initial cell state, with shape [batchSize, inSize] (NUMERIC type)
SRUWeights - Configuration Object
Returns:
output - The cell's outputs. (NUMERIC type)
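A hedged sketch of the sru layer. Note: the shapes used here follow the underlying native SRU op, which consumes a full sequence with 3-D input [batchSize, inSize, timeSteps], weights [inSize, 3*inSize], and bias [2*inSize]; these shapes, and the SRUWeights builder fields (`weights`, `bias`), are assumptions to verify against your ND4J version:

```java
import java.util.Arrays;

import org.nd4j.linalg.api.buffer.DataType;
import org.nd4j.linalg.api.ndarray.INDArray;
import org.nd4j.linalg.api.ops.impl.layers.recurrent.weights.SRUWeights;
import org.nd4j.linalg.factory.Nd4j;
import org.nd4j.linalg.factory.ops.NDRNN;

public class SruExample {
    public static void main(String[] args) {
        int bS = 2, nIn = 3, time = 5;  // hypothetical sizes

        SRUWeights weights = SRUWeights.builder()
                .weights(Nd4j.rand(DataType.FLOAT, nIn, 3 * nIn)) // assumed [inSize, 3*inSize]
                .bias(Nd4j.zeros(DataType.FLOAT, 2 * nIn))        // assumed [2*inSize]
                .build();

        INDArray x        = Nd4j.rand(DataType.FLOAT, bS, nIn, time); // assumed [bS, inSize, time]
        INDArray initialC = Nd4j.zeros(DataType.FLOAT, bS, nIn);      // initial cell state

        INDArray out = new NDRNN().sru(x, initialC, weights);
        System.out.println(Arrays.toString(out.shape()));
    }
}
```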
-
sruCell
public INDArray sruCell(INDArray x, INDArray cLast, SRUWeights SRUWeights)
The SRU cell. Performs a single time step operation.
Parameters:
x - Input, with shape [batchSize, inSize] (NUMERIC type)
cLast - Previous cell state, with shape [batchSize, inSize] (NUMERIC type)
SRUWeights - Configuration Object
Returns:
output - The cell's outputs. (NUMERIC type)
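A sketch of a single sruCell step. The SRUWeights builder fields (`weights`, `bias`) and the [inSize, 3*inSize] / [2*inSize] weight shapes are assumptions to verify against your ND4J version; unlike the LSTM/GRU cells, SRU's state size equals its input size:

```java
import java.util.Arrays;

import org.nd4j.linalg.api.buffer.DataType;
import org.nd4j.linalg.api.ndarray.INDArray;
import org.nd4j.linalg.api.ops.impl.layers.recurrent.weights.SRUWeights;
import org.nd4j.linalg.factory.Nd4j;
import org.nd4j.linalg.factory.ops.NDRNN;

public class SruCellExample {
    public static void main(String[] args) {
        int bS = 2, nIn = 3;  // hypothetical sizes

        SRUWeights weights = SRUWeights.builder()
                .weights(Nd4j.rand(DataType.FLOAT, nIn, 3 * nIn)) // assumed [inSize, 3*inSize]
                .bias(Nd4j.zeros(DataType.FLOAT, 2 * nIn))        // assumed [2*inSize]
                .build();

        INDArray x     = Nd4j.rand(DataType.FLOAT, bS, nIn);  // input for this step
        INDArray cLast = Nd4j.zeros(DataType.FLOAT, bS, nIn); // previous cell state

        INDArray out = new NDRNN().sruCell(x, cLast, weights);
        System.out.println(Arrays.toString(out.shape()));     // [batchSize, inSize]
    }
}
```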
-
-