public class Bidirectional extends Layer
Bidirectional is a wrapper layer: it wraps another (uni-directional) RNN layer, such as an LSTM, and runs it in both the forward and backward time directions. Several modes are supported for combining the activations of the forward and backward networks; see the Bidirectional.Mode javadoc for more details.

Usage: .layer(new Bidirectional(new LSTM.Builder()....build()))
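As an illustrative sketch (not text from this page), the following wraps an LSTM in a Bidirectional layer inside a minimal MultiLayerConfiguration; the sizes (nIn 32, nOut 64) and the output layer are placeholder assumptions.

```java
import org.deeplearning4j.nn.conf.MultiLayerConfiguration;
import org.deeplearning4j.nn.conf.NeuralNetConfiguration;
import org.deeplearning4j.nn.conf.layers.LSTM;
import org.deeplearning4j.nn.conf.layers.RnnOutputLayer;
import org.deeplearning4j.nn.conf.layers.recurrent.Bidirectional;
import org.nd4j.linalg.activations.Activation;
import org.nd4j.linalg.lossfunctions.LossFunctions;

public class BidirectionalUsage {
    public static void main(String[] args) {
        MultiLayerConfiguration conf = new NeuralNetConfiguration.Builder()
                .list()
                // Wrap a uni-directional LSTM; the default Mode is CONCAT,
                // so the next layer sees activations of size 2 * 64 = 128
                .layer(new Bidirectional(new LSTM.Builder().nIn(32).nOut(64).build()))
                .layer(new RnnOutputLayer.Builder(LossFunctions.LossFunction.MCXENT)
                        .activation(Activation.SOFTMAX).nIn(128).nOut(10).build())
                .build();
        System.out.println(conf);
    }
}
```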
Modifier and Type | Class and Description |
---|---|
static class | Bidirectional.Builder |
static class | Bidirectional.Mode. This Mode enumeration defines how the activations for the forward and backward networks should be combined (see the sketch after this table). ADD: out = forward + backward (elementwise addition); MUL: out = forward * backward (elementwise multiplication); AVERAGE: out = 0.5 * (forward + backward); CONCAT: concatenate the activations. Here 'forward' is the activations for the forward RNN, and 'backward' is the activations for the backward RNN. |
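To make those combination rules concrete, here is a minimal sketch of the four modes' arithmetic using plain ND4J operations on two toy activation arrays; it mirrors the semantics listed above rather than DL4J's internal implementation.

```java
import org.nd4j.linalg.api.ndarray.INDArray;
import org.nd4j.linalg.factory.Nd4j;

public class ModeSemantics {
    public static void main(String[] args) {
        INDArray forward = Nd4j.rand(3, 4);   // toy forward-RNN activations
        INDArray backward = Nd4j.rand(3, 4);  // toy backward-RNN activations

        INDArray add = forward.add(backward);             // ADD: elementwise addition
        INDArray mul = forward.mul(backward);             // MUL: elementwise multiplication
        INDArray avg = forward.add(backward).mul(0.5);    // AVERAGE: 0.5 * (forward + backward)
        INDArray cat = Nd4j.concat(1, forward, backward); // CONCAT: shape [3, 8]

        System.out.println(cat.shapeInfoToString());
    }
}
```

Only CONCAT changes the activation size; the other three modes preserve the shape of the wrapped layer's activations.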
Fields inherited from class Layer: constraints, iDropout, layerName
Constructor and Description |
---|
Bidirectional(Bidirectional.Mode mode, Layer layer): Create a Bidirectional wrapper for the specified layer |
Bidirectional(Layer layer): Create a Bidirectional wrapper, with the default Mode (CONCAT), for the specified layer |
Modifier and Type | Method and Description |
---|---|
GradientNormalization | getGradientNormalization() |
double | getGradientNormalizationThreshold() |
double | getL1ByParam(String paramName): Get the L1 coefficient for the given parameter. |
double | getL2ByParam(String paramName): Get the L2 coefficient for the given parameter. |
LayerMemoryReport | getMemoryReport(InputType inputType): This is a report of the estimated memory consumption for the given layer |
long | getNIn() |
long | getNOut() |
InputType | getOutputType(int layerIndex, InputType inputType): For a given type of input to this layer, what is the type of the output? |
InputPreProcessor | getPreProcessorForInputType(InputType inputType): For the given type of input to this layer, what preprocessor (if any) is required? Returns null if no preprocessor is required, otherwise returns an appropriate InputPreProcessor for this layer, such as a CnnToFeedForwardPreProcessor |
IUpdater | getUpdaterByParam(String paramName): Get the updater for the given parameter. |
ParamInitializer | initializer() |
Layer | instantiate(NeuralNetConfiguration conf, Collection<TrainingListener> trainingListeners, int layerIndex, INDArray layerParamsView, boolean initializeParams) |
boolean | isPretrain() |
boolean | isPretrainParam(String paramName): Is the specified parameter a layerwise pretraining only parameter? For example, visible bias params in an autoencoder (or decoder params in a variational autoencoder) aren't used during supervised backprop. Layers (like DenseLayer, etc.) with no pretrainable parameters will return false for all (valid) inputs. |
void | setLayerName(String layerName) |
void | setNIn(InputType inputType, boolean override): Set the nIn value (number of inputs, or input channels for CNNs) based on the given input type |
Methods inherited from class Layer: clone, initializeConstraints, resetLayerDefaultConfig, setPretrain

Methods inherited from class java.lang.Object: equals, finalize, getClass, hashCode, notify, notifyAll, toString, wait, wait, wait

Methods inherited from interface TrainingConfig: getLayerName
public Bidirectional(@NonNull Layer layer)
Create a Bidirectional wrapper, with the default Mode (CONCAT), for the specified layer.
Parameters:
layer - layer to wrap

public Bidirectional(@NonNull Bidirectional.Mode mode, @NonNull Layer layer)
Create a Bidirectional wrapper for the specified layer.
Parameters:
mode - Mode to use to combine activations. See Bidirectional.Mode for details
layer - layer to wrap

public long getNOut()

public long getNIn()
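A brief sketch of the two constructors, continuing the imports from the earlier example (the layer sizes are placeholder assumptions):

```java
// Default mode (CONCAT) for the wrapped layer
Bidirectional concat = new Bidirectional(new LSTM.Builder().nIn(32).nOut(64).build());

// Explicit mode: elementwise addition of forward and backward activations
Bidirectional added = new Bidirectional(Bidirectional.Mode.ADD,
        new LSTM.Builder().nIn(32).nOut(64).build());
```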
public Layer instantiate(NeuralNetConfiguration conf, Collection<TrainingListener> trainingListeners, int layerIndex, INDArray layerParamsView, boolean initializeParams)
Specified by:
instantiate in class Layer

public ParamInitializer initializer()
Specified by:
initializer in class Layer
public InputType getOutputType(int layerIndex, InputType inputType)
Description copied from class: Layer
For a given type of input to this layer, what is the type of the output?
Specified by:
getOutputType in class Layer
Parameters:
layerIndex - Index of the layer
inputType - Type of input for the layer

public void setNIn(InputType inputType, boolean override)
Description copied from class: Layer
Set the nIn value (number of inputs, or input channels for CNNs) based on the given input type
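As a sketch of the getOutputType contract (assuming the imports from the earlier examples plus org.deeplearning4j.nn.conf.inputs.InputType, a recurrent input of size 32, and a wrapped LSTM with nOut 64): under the default CONCAT mode the reported output would be expected to be recurrent with size 128, since concatenation doubles the wrapped layer's output size.

```java
Bidirectional layer = new Bidirectional(new LSTM.Builder().nIn(32).nOut(64).build());
InputType out = layer.getOutputType(0, InputType.recurrent(32));
System.out.println(out); // expected: a recurrent InputType of size 2 * 64 = 128
```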
public InputPreProcessor getPreProcessorForInputType(InputType inputType)
Description copied from class: Layer
For the given type of input to this layer, what preprocessor (if any) is required? Returns null if no preprocessor is required, otherwise returns an appropriate InputPreProcessor for this layer, such as a CnnToFeedForwardPreProcessor.
Specified by:
getPreProcessorForInputType in class Layer
Parameters:
inputType - InputType to this layer

public boolean isPretrain()
public double getL1ByParam(String paramName)
Description copied from class: Layer
Get the L1 coefficient for the given parameter.
Specified by:
getL1ByParam in interface TrainingConfig
getL1ByParam in class Layer
Parameters:
paramName - Parameter name

public double getL2ByParam(String paramName)
Description copied from class: Layer
Get the L2 coefficient for the given parameter.
Specified by:
getL2ByParam in interface TrainingConfig
getL2ByParam in class Layer
Parameters:
paramName - Parameter name

public boolean isPretrainParam(String paramName)
Description copied from class: Layer
Is the specified parameter a layerwise pretraining only parameter? For example, visible bias params in an autoencoder (or decoder params in a variational autoencoder) aren't used during supervised backprop. Layers (like DenseLayer, etc.) with no pretrainable parameters will return false for all (valid) inputs.
Specified by:
isPretrainParam in interface TrainingConfig
isPretrainParam in class Layer
Parameters:
paramName - Parameter name/key

public IUpdater getUpdaterByParam(String paramName)
Get the updater for the given parameter.
Specified by:
getUpdaterByParam in interface TrainingConfig
getUpdaterByParam in class Layer
Parameters:
paramName - Parameter name

public GradientNormalization getGradientNormalization()
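A hedged sketch of querying per-parameter training configuration, given the layer instance from the previous sketch; the key "fw_W" is a hypothetical example of a forward-network weight key, not a name documented on this page (actual keys come from the layer's ParamInitializer).

```java
String key = "fw_W"; // hypothetical parameter key, for illustration only
double l1 = layer.getL1ByParam(key);               // L1 coefficient for that parameter
IUpdater updater = layer.getUpdaterByParam(key);   // updater configured for that parameter
boolean pretrainOnly = layer.isPretrainParam(key); // expected false: RNN weights are not pretrain-only
```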
public double getGradientNormalizationThreshold()
public void setLayerName(String layerName)
public LayerMemoryReport getMemoryReport(InputType inputType)
Description copied from class: Layer
This is a report of the estimated memory consumption for the given layer
Specified by:
getMemoryReport in class Layer
Parameters:
inputType - Input type to the layer. Memory consumption is often a function of the input type