public abstract class Layer extends Object implements Serializable, Cloneable
Modifier and Type | Class and Description
---|---
static class | Layer.Builder<T extends Layer.Builder<T>>
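Concrete layer classes expose builders derived from Layer.Builder. A minimal usage sketch, assuming the stock DenseLayer subclass and that name(...) and dropOut(...) behave as their names suggest; both populate the inherited layerName and iDropout fields listed below:

```java
import org.deeplearning4j.nn.conf.layers.DenseLayer;

// Sketch only: configuring a concrete subclass via its Builder.
DenseLayer dense = new DenseLayer.Builder()
        .name("fc1")       // inherited from Layer.Builder -> layerName
        .dropOut(0.5)      // inherited from Layer.Builder -> iDropout
        .nIn(784)          // defined on FeedForwardLayer.Builder, not Layer.Builder
        .nOut(100)
        .build();
```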
Modifier and Type | Field and Description
---|---
protected List<LayerConstraint> | constraints
protected IDropout | iDropout
protected String | layerName
Constructor and Description
---
Layer(Layer.Builder builder)
Modifier and Type | Method and Description
---|---
Layer | clone()
abstract double | getL1ByParam(String paramName) Get the L1 coefficient for the given parameter.
abstract double | getL2ByParam(String paramName) Get the L2 coefficient for the given parameter.
abstract LayerMemoryReport | getMemoryReport(InputType inputType) Returns a report of the estimated memory consumption for the given layer.
abstract InputType | getOutputType(int layerIndex, InputType inputType) For a given type of input to this layer, what is the type of the output?
abstract InputPreProcessor | getPreProcessorForInputType(InputType inputType) For the given type of input to this layer, what preprocessor (if any) is required? Returns null if no preprocessor is required, otherwise returns an appropriate InputPreProcessor for this layer, such as a CnnToFeedForwardPreProcessor.
org.nd4j.linalg.learning.config.IUpdater | getUpdaterByParam(String paramName) Get the updater for the given parameter.
protected void | initializeConstraints(Layer.Builder<?> builder) Initialize the weight constraints.
abstract ParamInitializer | initializer()
abstract Layer | instantiate(NeuralNetConfiguration conf, Collection<TrainingListener> trainingListeners, int layerIndex, org.nd4j.linalg.api.ndarray.INDArray layerParamsView, boolean initializeParams)
abstract boolean | isPretrainParam(String paramName) Is the specified parameter a layerwise pretraining only parameter? For example, visible bias params in an autoencoder (or decoder params in a variational autoencoder) aren't used during supervised backprop. Layers (like DenseLayer, etc.) with no pretrainable parameters will return false for all (valid) inputs.
void | resetLayerDefaultConfig() Reset the learning-related configs of the layer to default.
abstract void | setNIn(InputType inputType, boolean override) Set the nIn value (number of inputs, or input channels for CNNs) based on the given input type.
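Most of these methods are invoked by the configuration machinery rather than directly by user code. A sketch of that flow, assuming the stock DenseLayer and OutputLayer classes: setting an input type on the list builder drives setNIn(...), getOutputType(...) and getPreProcessorForInputType(...) for each layer in turn.

```java
import org.deeplearning4j.nn.conf.MultiLayerConfiguration;
import org.deeplearning4j.nn.conf.NeuralNetConfiguration;
import org.deeplearning4j.nn.conf.inputs.InputType;
import org.deeplearning4j.nn.conf.layers.DenseLayer;
import org.deeplearning4j.nn.conf.layers.OutputLayer;

public class LayerConfigSketch {
    public static void main(String[] args) {
        MultiLayerConfiguration conf = new NeuralNetConfiguration.Builder()
                .list()
                .layer(0, new DenseLayer.Builder().nOut(100).build())
                .layer(1, new OutputLayer.Builder().nOut(10).build())
                // Triggers setNIn(...), getOutputType(...) and
                // getPreProcessorForInputType(...) on each layer, so nIn values
                // and a CnnToFeedForwardPreProcessor are inferred automatically.
                .setInputType(InputType.convolutional(28, 28, 1))
                .build();
        System.out.println(conf.toJson());
    }
}
```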
protected String layerName
protected IDropout iDropout
protected List<LayerConstraint> constraints
public Layer(Layer.Builder builder)
protected void initializeConstraints(Layer.Builder<?> builder)
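As a sketch of how this pair is used, consider a hypothetical subclass, MyCustomLayer (illustrative only, as are its Builder and build()): the constructor passes its builder to super(...), then calls initializeConstraints(...) to populate the constraints field. The remaining abstract methods are sketched under their entries below.

```java
// Hypothetical subclass; the abstract method overrides are omitted here and
// sketched under the individual method entries below.
public class MyCustomLayer extends Layer {

    protected MyCustomLayer(Builder builder) {
        super(builder);                 // copies layerName, iDropout, ...
        initializeConstraints(builder); // applies any configured constraints
    }

    public static class Builder extends Layer.Builder<Builder> {
        @Override
        @SuppressWarnings("unchecked")
        public MyCustomLayer build() {
            return new MyCustomLayer(this);
        }
    }
}
```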
public void resetLayerDefaultConfig()
public abstract Layer instantiate(NeuralNetConfiguration conf, Collection<TrainingListener> trainingListeners, int layerIndex, org.nd4j.linalg.api.ndarray.INDArray layerParamsView, boolean initializeParams)
public abstract ParamInitializer initializer()
public abstract InputType getOutputType(int layerIndex, InputType inputType)
For a given type of input to this layer, what is the type of the output?
Parameters:
layerIndex - Index of the layer
inputType - Type of input for the layer
Throws:
IllegalStateException - if input type is invalid for this layer
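Continuing the hypothetical MyCustomLayer from above: a dense-style implementation accepts feed-forward input and reports a feed-forward output of its own width (nOut is an assumed field of the sketch).

```java
@Override
public InputType getOutputType(int layerIndex, InputType inputType) {
    if (inputType == null || inputType.getType() != InputType.Type.FF) {
        throw new IllegalStateException("Invalid input type for layer "
                + layerIndex + ": " + inputType);
    }
    // nOut is a hypothetical field of MyCustomLayer
    return InputType.feedForward(nOut);
}
```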
public abstract void setNIn(InputType inputType, boolean override)
Set the nIn value (number of inputs, or input channels for CNNs) based on the given input type.
Parameters:
inputType - Input type for this layer
override - If false: only set the nIn value if it's not already set. If true: set it regardless of whether it's already set or not.
Throws:
IllegalStateException - if input type is invalid for this layer
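A matching setNIn(...) sketch for MyCustomLayer, mirroring what stock feed-forward layers do: infer nIn (an assumed field) from the incoming feed-forward size, respecting the override flag.

```java
@Override
public void setNIn(InputType inputType, boolean override) {
    if (inputType == null || inputType.getType() != InputType.Type.FF) {
        throw new IllegalStateException("Invalid input type: " + inputType);
    }
    if (override || nIn <= 0) { // only overwrite an unset value unless forced
        nIn = ((InputType.InputTypeFeedForward) inputType).getSize();
    }
}
```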
public abstract InputPreProcessor getPreProcessorForInputType(InputType inputType)
For the given type of input to this layer, what preprocessor (if any) is required?
Parameters:
inputType - InputType to this layer
Returns:
null if no preprocessor is required; otherwise an appropriate InputPreProcessor for this layer, such as a CnnToFeedForwardPreProcessor
Throws:
IllegalStateException - if input type is invalid for this layer
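For MyCustomLayer, a sketch that accepts feed-forward input as-is and flattens convolutional input, as the description above suggests. (The InputTypeConvolutional getters are assumptions; older versions exposed getDepth() rather than getChannels().)

```java
@Override
public InputPreProcessor getPreProcessorForInputType(InputType inputType) {
    switch (inputType.getType()) {
        case FF:
            return null; // already feed-forward: no preprocessing needed
        case CNN: {
            InputType.InputTypeConvolutional cnn =
                    (InputType.InputTypeConvolutional) inputType;
            // flatten CNN activations into the 2d shape this layer expects
            return new CnnToFeedForwardPreProcessor(
                    cnn.getHeight(), cnn.getWidth(), cnn.getChannels());
        }
        default:
            throw new IllegalStateException("Invalid input type: " + inputType);
    }
}
```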
public abstract double getL1ByParam(String paramName)
Get the L1 coefficient for the given parameter.
Parameters:
paramName - Parameter name
public abstract double getL2ByParam(String paramName)
Get the L2 coefficient for the given parameter.
Parameters:
paramName - Parameter name
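A sketch covering this method and getL1ByParam(...) above in MyCustomLayer: a common convention, followed by the stock layers, is to regularize weights but not biases, keyed by DefaultParamInitializer's parameter names; l1 and l2 are assumed fields of the sketch.

```java
@Override
public double getL1ByParam(String paramName) {
    // l1/l2 are hypothetical fields; biases are conventionally left unregularized
    return DefaultParamInitializer.WEIGHT_KEY.equals(paramName) ? l1 : 0.0;
}

@Override
public double getL2ByParam(String paramName) {
    return DefaultParamInitializer.WEIGHT_KEY.equals(paramName) ? l2 : 0.0;
}
```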
public abstract boolean isPretrainParam(String paramName)
Is the specified parameter a layerwise pretraining only parameter?
Parameters:
paramName - Parameter name/key
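For a layer with no pretrainable parameters, like the DenseLayer case described in the summary above, the MyCustomLayer sketch is one line:

```java
@Override
public boolean isPretrainParam(String paramName) {
    // no layerwise-pretraining-only parameters in this sketch
    return false;
}
```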
public org.nd4j.linalg.learning.config.IUpdater getUpdaterByParam(String paramName)
Get the updater for the given parameter.
Parameters:
paramName - Parameter name
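In the MyCustomLayer sketch, a subclass that stores its own updater configuration (iUpdater here is an assumed field, mirroring what concrete DL4J layers hold) can simply hand it back for every parameter:

```java
@Override
public org.nd4j.linalg.learning.config.IUpdater getUpdaterByParam(String paramName) {
    // same updater for every parameter in this sketch
    return iUpdater;
}
```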
public abstract LayerMemoryReport getMemoryReport(InputType inputType)
Returns a report of the estimated memory consumption for the given layer.
Parameters:
inputType - Input type to the layer. Memory consumption is often a function of the input type