Modifier and Type | Field and Description |
---|---|
protected Map<Integer,InputPreProcessor> | MultiLayerConfiguration.inputPreProcessors |
protected Map<Integer,InputPreProcessor> | MultiLayerConfiguration.Builder.inputPreProcessors |
protected Map<String,InputPreProcessor> | ComputationGraphConfiguration.GraphBuilder.inputPreProcessors |
Modifier and Type | Method and Description |
---|---|
InputPreProcessor | InputPreProcessor.clone() |
InputPreProcessor | MultiLayerConfiguration.getInputPreProcess(int curr) |
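As a minimal sketch of how the two accessors above could be used together, the helper below copies the preprocessor registered at a given layer index of an already-built MultiLayerConfiguration. Only getInputPreProcess(int) and clone() come from the tables on this page; the class name, the null-check behaviour, and the package names are assumptions.

```java
import org.deeplearning4j.nn.conf.InputPreProcessor;
import org.deeplearning4j.nn.conf.MultiLayerConfiguration;

public class PreProcessorLookup {
    /**
     * Returns an independent copy of the preprocessor registered at the given layer index,
     * or null if none is set (assumption: getInputPreProcess returns null in that case).
     */
    public static InputPreProcessor copyOf(MultiLayerConfiguration conf, int layerIndex) {
        InputPreProcessor p = conf.getInputPreProcess(layerIndex); // accessor listed above
        return p == null ? null : p.clone();                       // clone() listed above
    }
}
```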
Modifier and Type | Method and Description |
---|---|
ComputationGraphConfiguration.GraphBuilder | ComputationGraphConfiguration.GraphBuilder.addLayer(String layerName, Layer layer, InputPreProcessor preProcessor, String... layerInputs) Add a layer and an InputPreProcessor, with the specified name and specified inputs. |
MultiLayerConfiguration.Builder | MultiLayerConfiguration.Builder.inputPreProcessor(Integer layer, InputPreProcessor processor) Specify the processor for a given layer index. These are used at each layer for doing things like normalization and shaping of input. |
ComputationGraphConfiguration.GraphBuilder | ComputationGraphConfiguration.GraphBuilder.inputPreProcessor(String layer, InputPreProcessor processor) Specify the processor for a given layer name. These are used at each layer for doing things like normalization and shaping of input. Note: preprocessors can also be defined using the ComputationGraphConfiguration.GraphBuilder.addLayer(String, Layer, InputPreProcessor, String...) method. |
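The two GraphBuilder routes above can be sketched roughly as follows. Only addLayer(String, Layer, InputPreProcessor, String...) and inputPreProcessor(String, InputPreProcessor) are taken from this table; graphBuilder(), addInputs(), setOutputs(), the layer builders, the CnnToFeedForwardPreProcessor constructor arguments, and the sizes are assumptions for illustration, not prescribed by this page.

```java
import org.deeplearning4j.nn.conf.ComputationGraphConfiguration;
import org.deeplearning4j.nn.conf.NeuralNetConfiguration;
import org.deeplearning4j.nn.conf.layers.DenseLayer;
import org.deeplearning4j.nn.conf.layers.OutputLayer;
import org.deeplearning4j.nn.conf.preprocessor.CnnToFeedForwardPreProcessor;
import org.nd4j.linalg.lossfunctions.LossFunctions;

public class GraphPreProcessorSketch {
    public static ComputationGraphConfiguration build() {
        return new NeuralNetConfiguration.Builder()
                .graphBuilder()
                .addInputs("in")
                // Variant 1: supply the preprocessor together with the layer
                // (flattens an assumed 28x28x1 image-shaped input for the dense layer)
                .addLayer("dense",
                        new DenseLayer.Builder().nIn(28 * 28).nOut(100).build(),
                        new CnnToFeedForwardPreProcessor(28, 28, 1),
                        "in")
                // Variant 2 (equivalent, per the note above): register it by layer name
                // .inputPreProcessor("dense", new CnnToFeedForwardPreProcessor(28, 28, 1))
                .addLayer("out",
                        new OutputLayer.Builder(LossFunctions.LossFunction.MCXENT)
                                .nIn(100).nOut(10).build(),
                        "dense")
                .setOutputs("out")
                .build();
    }
}
```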
Modifier and Type | Method and Description |
---|---|
MultiLayerConfiguration.Builder | MultiLayerConfiguration.Builder.inputPreProcessors(Map<Integer,InputPreProcessor> processors) |
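A hedged sketch of the bulk Map-based form above alongside the per-layer form from the previous table. The builder is assumed to be created elsewhere, and the FeedForwardToRnnPreProcessor no-arg constructor and package names are assumptions.

```java
import java.util.HashMap;
import java.util.Map;
import org.deeplearning4j.nn.conf.InputPreProcessor;
import org.deeplearning4j.nn.conf.MultiLayerConfiguration;
import org.deeplearning4j.nn.conf.preprocessor.FeedForwardToRnnPreProcessor;

public class BulkPreProcessorSketch {
    /** Registers a preprocessor for layer index 1 in two equivalent ways on a builder created elsewhere. */
    public static void register(MultiLayerConfiguration.Builder builder) {
        Map<Integer, InputPreProcessor> procs = new HashMap<>();
        procs.put(1, new FeedForwardToRnnPreProcessor()); // assumed no-arg constructor
        builder.inputPreProcessors(procs);                // bulk form, from the table above

        // One-at-a-time form from the previous table; here it just re-registers the same entry.
        builder.inputPreProcessor(1, new FeedForwardToRnnPreProcessor());
    }
}
```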
Constructor and Description |
---|
LayerVertex(NeuralNetConfiguration layerConf, InputPreProcessor preProcessor) |
PreprocessorVertex(InputPreProcessor preProcessor) |
PreprocessorVertex(InputPreProcessor preProcessor, InputType outputType) Deprecated. This constructor (and the "InputType override" functionality it previously provided) is no longer necessary. |
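A hedged fragment showing where the PreprocessorVertex(InputPreProcessor) constructor above might appear when building a graph configuration. GraphBuilder.addVertex(...), the vertex and layer names, and the FeedForwardToRnnPreProcessor constructor are assumptions not listed on this page; only the vertex constructor itself comes from the table.

```java
// Assumed: "graphBuilder" is a ComputationGraphConfiguration.GraphBuilder created elsewhere,
// and addVertex(String, GraphVertex, String...) is available on it (not listed on this page).
// Inserts a standalone reshaping vertex between an assumed "dense" layer and an RNN layer.
graphBuilder.addVertex("ff-to-rnn",
        new PreprocessorVertex(new FeedForwardToRnnPreProcessor()),
        "dense");
```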
Modifier and Type | Class and Description |
---|---|
class | BaseInputPreProcessor |
class | BinomialSamplingPreProcessor Binomial sampling pre processor. |
class | CnnToFeedForwardPreProcessor A preprocessor to allow CNN and standard feed-forward network layers to be used together. For example, CNN -> DenseLayer. This does two things: (a) Reshapes 4d activations out of the CNN layer, with shape [numExamples, numChannels, inputHeight, inputWidth], into 2d activations with shape [numExamples, inputHeight*inputWidth*numChannels] for use in a feed-forward layer. (b) Reshapes epsilons (weights*deltas) out of the feed-forward layer (2d or 3d, with shape [numExamples, inputHeight*inputWidth*numChannels]) into 4d epsilons with shape [numExamples, numChannels, inputHeight, inputWidth] suitable to feed into CNN layers. Note: numChannels is equivalent to depth or featureMaps referenced in different literature. |
class | CnnToRnnPreProcessor A preprocessor to allow CNN and RNN layers to be used together. For example, ConvolutionLayer -> GravesLSTM. Functionally equivalent to combining CnnToFeedForwardPreProcessor + FeedForwardToRnnPreProcessor. Specifically, this does two things: (a) Reshapes 4d activations out of the CNN layer, with shape [timeSeriesLength*miniBatchSize, numChannels, inputHeight, inputWidth], into 3d (time series) activations with shape [miniBatchSize, inputHeight*inputWidth*numChannels, timeSeriesLength] for use in RNN layers. (b) Reshapes 3d epsilons (weights*deltas) out of the RNN layer, with shape [miniBatchSize, inputHeight*inputWidth*numChannels, timeSeriesLength], into 4d epsilons with shape [miniBatchSize*timeSeriesLength, numChannels, inputHeight, inputWidth] suitable to feed into CNN layers. |
class | ComposableInputPreProcessor Composable input pre processor. |
class | FeedForwardToCnnPreProcessor A preprocessor to allow CNN and standard feed-forward network layers to be used together. For example, DenseLayer -> CNN. This does two things: (a) Reshapes activations out of the feed-forward layer (2d or 3d, with shape [numExamples, inputHeight*inputWidth*numChannels]) into 4d activations with shape [numExamples, numChannels, inputHeight, inputWidth] suitable to feed into CNN layers. (b) Reshapes 4d epsilons (weights*deltas) from the CNN layer, with shape [numExamples, numChannels, inputHeight, inputWidth], into 2d epsilons with shape [numExamples, inputHeight*inputWidth*numChannels] for use in the feed-forward layer. Note: numChannels is equivalent to depth or featureMaps referenced in different literature. |
class | FeedForwardToRnnPreProcessor A preprocessor to allow RNN and feed-forward network layers to be used together. For example, DenseLayer -> GravesLSTM. This does two things: (a) Reshapes activations out of the feed-forward layer (2d, with shape [miniBatchSize*timeSeriesLength, layerSize]) into 3d activations with shape [miniBatchSize, layerSize, timeSeriesLength] suitable to feed into RNN layers. (b) Reshapes 3d epsilons (weights*deltas from the RNN layer, with shape [miniBatchSize, layerSize, timeSeriesLength]) into 2d epsilons with shape [miniBatchSize*timeSeriesLength, layerSize] for use in the feed-forward layer. |
class | ReshapePreProcessor Deprecated. |
class | RnnToCnnPreProcessor A preprocessor to allow RNN and CNN layers to be used together. For example, time series (video) input -> ConvolutionLayer, or conceivably GravesLSTM -> ConvolutionLayer. Functionally equivalent to combining RnnToFeedForwardPreProcessor + FeedForwardToCnnPreProcessor. Specifically, this does two things: (a) Reshapes 3d activations out of the RNN layer, with shape [miniBatchSize, numChannels*inputHeight*inputWidth, timeSeriesLength], into 4d (CNN) activations with shape [miniBatchSize*timeSeriesLength, numChannels, inputHeight, inputWidth]. (b) Reshapes 4d epsilons (weights*deltas) out of the CNN layer, with shape [miniBatchSize*timeSeriesLength, numChannels, inputHeight, inputWidth], into 3d epsilons with shape [miniBatchSize, numChannels*inputHeight*inputWidth, timeSeriesLength] suitable to feed into RNN layers. |
class | RnnToFeedForwardPreProcessor A preprocessor to allow RNN and feed-forward network layers to be used together. For example, GravesLSTM -> OutputLayer or GravesLSTM -> DenseLayer. This does two things: (a) Reshapes activations out of the RNN layer (3d, with shape [miniBatchSize, layerSize, timeSeriesLength]) into 2d activations with shape [miniBatchSize*timeSeriesLength, layerSize] suitable for use in feed-forward layers. (b) Reshapes 2d epsilons (weights*deltas from the feed-forward layer, with shape [miniBatchSize*timeSeriesLength, layerSize]) into 3d epsilons with shape [miniBatchSize, layerSize, timeSeriesLength] for use in the RNN layer. |
class | UnitVarianceProcessor Unit variance operation. |
class | ZeroMeanAndUnitVariancePreProcessor Zero mean and unit variance operation. |
class | ZeroMeanPrePreProcessor Zero mean operation. |
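To make the reshaping described above concrete, here is a hedged sketch of a GravesLSTM -> DenseLayer -> RnnOutputLayer stack in which RnnToFeedForwardPreProcessor and FeedForwardToRnnPreProcessor sit at the layer boundaries, registered via inputPreProcessor(Integer, InputPreProcessor) from the earlier table. The list()/layer() builder chain, the layer types and sizes, the loss function, and the package names are assumptions for illustration; only the preprocessor classes and the registration method come from this page.

```java
import org.deeplearning4j.nn.conf.MultiLayerConfiguration;
import org.deeplearning4j.nn.conf.NeuralNetConfiguration;
import org.deeplearning4j.nn.conf.layers.DenseLayer;
import org.deeplearning4j.nn.conf.layers.GravesLSTM;
import org.deeplearning4j.nn.conf.layers.RnnOutputLayer;
import org.deeplearning4j.nn.conf.preprocessor.FeedForwardToRnnPreProcessor;
import org.deeplearning4j.nn.conf.preprocessor.RnnToFeedForwardPreProcessor;
import org.nd4j.linalg.lossfunctions.LossFunctions;

public class PreProcessorPlacementSketch {
    public static MultiLayerConfiguration build() {
        return new NeuralNetConfiguration.Builder()
                .list()
                .layer(0, new GravesLSTM.Builder().nIn(10).nOut(20).build())
                .layer(1, new DenseLayer.Builder().nIn(20).nOut(20).build())
                .layer(2, new RnnOutputLayer.Builder(LossFunctions.LossFunction.MCXENT)
                        .nIn(20).nOut(5).build())
                // 3d RNN activations -> 2d, applied to the input of layer 1 (the DenseLayer)
                .inputPreProcessor(1, new RnnToFeedForwardPreProcessor())
                // 2d feed-forward activations -> 3d, applied to the input of layer 2 (the RnnOutputLayer)
                .inputPreProcessor(2, new FeedForwardToRnnPreProcessor())
                .build();
    }
}
```

As assumed here, the index passed to inputPreProcessor refers to the layer whose input is reshaped, so each preprocessor is registered at the index of the layer that receives the converted activations.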
Constructor and Description |
---|
ComposableInputPreProcessor(InputPreProcessor... inputPreProcessors) |
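A hedged sketch of composing preprocessors with the varargs constructor above, e.g. a normalization step followed by a reshape. The component no-arg constructors, the package names, and the in-order application of the components are assumptions.

```java
import org.deeplearning4j.nn.conf.InputPreProcessor;
import org.deeplearning4j.nn.conf.preprocessor.ComposableInputPreProcessor;
import org.deeplearning4j.nn.conf.preprocessor.FeedForwardToRnnPreProcessor;
import org.deeplearning4j.nn.conf.preprocessor.UnitVarianceProcessor;

public class ComposedPreProcessorSketch {
    // Chains two preprocessors with the varargs constructor listed above.
    // Assumption: the components are applied in the order they are passed.
    public static InputPreProcessor build() {
        return new ComposableInputPreProcessor(
                new UnitVarianceProcessor(),           // unit variance operation (see class table)
                new FeedForwardToRnnPreProcessor());   // then reshape 2d -> 3d for an RNN layer
    }
}
```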
Constructor and Description |
---|
LayerVertex(ComputationGraph graph, String name, int vertexIndex, Layer layer, InputPreProcessor layerPreProcessor, boolean outputVertex) Create a layer vertex (without defined input/output vertices). |
LayerVertex(ComputationGraph graph, String name, int vertexIndex, VertexIndices[] inputVertices, VertexIndices[] outputVertices, Layer layer, InputPreProcessor layerPreProcessor, boolean outputVertex) |
PreprocessorVertex(ComputationGraph graph, String name, int vertexIndex, InputPreProcessor preProcessor) |
PreprocessorVertex(ComputationGraph graph, String name, int vertexIndex, VertexIndices[] inputVertices, VertexIndices[] outputVertices, InputPreProcessor preProcessor) |