Modifier and Type | Method and Description |
---|---|
int | ParamInitializer.numParams(Layer layer) |
Modifier and Type | Field and Description |
---|---|
protected Layer | NeuralNetConfiguration.layer |
protected Layer | NeuralNetConfiguration.Builder.layer |
Modifier and Type | Method and Description |
---|---|
ComputationGraphConfiguration.GraphBuilder | ComputationGraphConfiguration.GraphBuilder.addLayer(String layerName, Layer layer, InputPreProcessor preProcessor, String... layerInputs) Add a layer and an InputPreProcessor, with the specified name and specified inputs. |
ComputationGraphConfiguration.GraphBuilder | ComputationGraphConfiguration.GraphBuilder.addLayer(String layerName, Layer layer, String... layerInputs) Add a layer, with no InputPreProcessor, with the specified name and specified inputs. |
NeuralNetConfiguration.ListBuilder | NeuralNetConfiguration.ListBuilder.layer(int ind, Layer layer) |
NeuralNetConfiguration.Builder | NeuralNetConfiguration.Builder.layer(Layer layer) Set the layer configuration. |
NeuralNetConfiguration.ListBuilder | NeuralNetConfiguration.Builder.list(Layer... layers) Create a ListBuilder (for creating a MultiLayerConfiguration) with the specified layers. |
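The Javadoc's usage example for list(Layer...) was lost in extraction. As a minimal sketch (assuming DL4J 0.9.x-era package names and the Activation and LossFunctions enums), a MultiLayerConfiguration can be assembled via the ListBuilder like this:

```java
import org.deeplearning4j.nn.conf.MultiLayerConfiguration;
import org.deeplearning4j.nn.conf.NeuralNetConfiguration;
import org.deeplearning4j.nn.conf.layers.DenseLayer;
import org.deeplearning4j.nn.conf.layers.OutputLayer;
import org.nd4j.linalg.activations.Activation;
import org.nd4j.linalg.lossfunctions.LossFunctions;

public class ListBuilderExample {
    public static void main(String[] args) {
        // list() returns a NeuralNetConfiguration.ListBuilder;
        // layer(int ind, Layer layer) places each layer at an explicit index.
        MultiLayerConfiguration conf = new NeuralNetConfiguration.Builder()
                .seed(123)
                .list()
                .layer(0, new DenseLayer.Builder()
                        .nIn(784).nOut(100)
                        .activation(Activation.RELU)
                        .build())
                .layer(1, new OutputLayer.Builder(LossFunctions.LossFunction.NEGATIVELOGLIKELIHOOD)
                        .nIn(100).nOut(10)
                        .activation(Activation.SOFTMAX)
                        .build())
                .build();
        System.out.println(conf.toJson());
    }
}
```

The same configuration could also be passed as varargs to list(Layer...) directly; the indexed layer(int, Layer) form makes layer ordering explicit.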
Modifier and Type | Class and Description |
---|---|
class | AbstractLSTM LSTM recurrent net, based on Graves: Supervised Sequence Labelling with Recurrent Neural Networks (http://www.cs.toronto.edu/~graves/phd.pdf) |
class | ActivationLayer |
class | AutoEncoder Autoencoder. |
class | BaseLayer A neural network layer. |
class | BaseOutputLayer |
class | BasePretrainNetwork |
class | BaseRecurrentLayer |
class | BatchNormalization Batch normalization configuration. |
class | CenterLossOutputLayer Center loss is similar to triplet loss, except that it enforces intra-class consistency and does not require a feed-forward pass over multiple examples. |
class | Convolution1DLayer 1D (temporal) convolutional layer. |
class | ConvolutionLayer |
class | DenseLayer Dense layer: fully connected feed-forward layer, trainable by backprop. |
class | DropoutLayer |
class | EmbeddingLayer Embedding layer: feed-forward layer that expects a single integer per example (a class number, in range 0 to numClass-1) as input. |
class | FeedForwardLayer Base class for feed-forward layer configurations. |
class | GlobalPoolingLayer Global pooling layer: used for pooling over time for RNNs, and 2D pooling for CNNs. Supports the following PoolingTypes: SUM, AVG, MAX, PNORM. Can also handle mask arrays when dealing with variable-length inputs. |
class | GravesBidirectionalLSTM Bidirectional LSTM recurrent net, based on Graves: Supervised Sequence Labelling with Recurrent Neural Networks (http://www.cs.toronto.edu/~graves/phd.pdf) |
class | GravesLSTM LSTM recurrent net, based on Graves: Supervised Sequence Labelling with Recurrent Neural Networks (http://www.cs.toronto.edu/~graves/phd.pdf) |
class | LocalResponseNormalization Local response normalization layer configuration. |
class | LossLayer LossLayer is a flexible output "layer" that applies a loss function to an input, without MLP logic. |
class | OutputLayer Output layer for classification or regression, with a configurable loss function. |
class | RBM Restricted Boltzmann Machine. |
class | RnnOutputLayer |
class | Subsampling1DLayer 1D (temporal) subsampling layer. |
class | SubsamplingLayer Subsampling layer, also referred to as pooling in convolutional neural nets. Supports the following pooling types: MAX, AVG, NONE. |
class | ZeroPaddingLayer Zero padding layer for convolutional neural networks. |
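To illustrate how several of the layer classes above fit together, here is a hedged sketch of a small convolutional configuration (assuming DL4J 0.9.x-era packages, the InputType helper, and SubsamplingLayer.PoolingType):

```java
import org.deeplearning4j.nn.conf.MultiLayerConfiguration;
import org.deeplearning4j.nn.conf.NeuralNetConfiguration;
import org.deeplearning4j.nn.conf.inputs.InputType;
import org.deeplearning4j.nn.conf.layers.ConvolutionLayer;
import org.deeplearning4j.nn.conf.layers.DenseLayer;
import org.deeplearning4j.nn.conf.layers.OutputLayer;
import org.deeplearning4j.nn.conf.layers.SubsamplingLayer;
import org.nd4j.linalg.activations.Activation;
import org.nd4j.linalg.lossfunctions.LossFunctions;

public class LayerClassesExample {
    public static void main(String[] args) {
        MultiLayerConfiguration conf = new NeuralNetConfiguration.Builder()
                .list()
                // ConvolutionLayer: 5x5 kernel, 1 input channel, 20 feature maps
                .layer(0, new ConvolutionLayer.Builder(5, 5)
                        .nIn(1).nOut(20).activation(Activation.RELU).build())
                // SubsamplingLayer: 2x2 max pooling
                .layer(1, new SubsamplingLayer.Builder(SubsamplingLayer.PoolingType.MAX)
                        .kernelSize(2, 2).stride(2, 2).build())
                // DenseLayer: fully connected, trainable by backprop
                .layer(2, new DenseLayer.Builder().nOut(100)
                        .activation(Activation.RELU).build())
                // OutputLayer: softmax classification over 10 classes
                .layer(3, new OutputLayer.Builder(LossFunctions.LossFunction.NEGATIVELOGLIKELIHOOD)
                        .nOut(10).activation(Activation.SOFTMAX).build())
                // Lets DL4J infer nIn values between the conv/dense layers
                .setInputType(InputType.convolutionalFlat(28, 28, 1))
                .build();
        System.out.println(conf.toJson());
    }
}
```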
Modifier and Type | Method and Description |
---|---|
abstract <E extends Layer> E | Layer.Builder.build() |
Modifier and Type | Method and Description |
---|---|
Layer | Layer.clone() |
Modifier and Type | Method and Description |
---|---|
static void | LayerValidation.generalValidation(String layerName, Layer layer, boolean useRegularization, boolean useDropConnect, double dropOut, double l2, double l2Bias, double l1, double l1Bias, Distribution dist) |
static void | LayerValidation.generalValidation(String layerName, Layer layer, boolean useRegularization, boolean useDropConnect, Double dropOut, Double l2, Double l2Bias, Double l1, Double l1Bias, Distribution dist) |
static void | LayerValidation.updaterValidation(String layerName, Layer layer, Double learningRate, Double momentum, Map<Integer,Double> momentumSchedule, Double adamMeanDecay, Double adamVarDecay, Double rho, Double rmsDecay, Double epsilon) Validate the updater configuration, setting default updater values where necessary. |
Modifier and Type | Class and Description |
---|---|
class | FrozenLayer Layer configuration wrapper that keeps the wrapped layer's parameters frozen (not updated) during training. |
Modifier and Type | Field and Description |
---|---|
protected Layer | FrozenLayer.layer |
Modifier and Type | Method and Description |
---|---|
Layer | FrozenLayer.clone() |
Modifier and Type | Method and Description |
---|---|
FrozenLayer.Builder | FrozenLayer.Builder.layer(Layer layer) |
Constructor and Description |
---|
FrozenLayer(Layer layer) |
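Both the FrozenLayer(Layer) constructor and FrozenLayer.Builder.layer(Layer) shown above wrap an existing layer configuration. A minimal sketch (assuming the org.deeplearning4j.nn.conf.layers.misc package location for FrozenLayer):

```java
import org.deeplearning4j.nn.conf.layers.DenseLayer;
import org.deeplearning4j.nn.conf.layers.Layer;
import org.deeplearning4j.nn.conf.layers.misc.FrozenLayer;

public class FrozenLayerExample {
    public static void main(String[] args) {
        Layer inner = new DenseLayer.Builder().nIn(10).nOut(10).build();

        // Wrap directly via the constructor listed above...
        FrozenLayer frozen = new FrozenLayer(inner);

        // ...or equivalently via the builder's layer(Layer) method.
        FrozenLayer frozen2 = new FrozenLayer.Builder().layer(inner).build();

        System.out.println(frozen);
        System.out.println(frozen2);
    }
}
```

Either form can then be used anywhere a Layer is accepted, e.g. in a ListBuilder, to exclude that layer's parameters from updates.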
Modifier and Type | Class and Description |
---|---|
class | VariationalAutoencoder Variational autoencoder layer. |
Modifier and Type | Method and Description |
---|---|
protected void | BaseNetConfigDeserializer.handleUpdaterBackwardCompatibility(Layer[] layers) |
Modifier and Type | Class and Description |
---|---|
class | AbstractLayer<LayerConfT extends Layer> A layer with input and output, but no parameters or gradients. |
Modifier and Type | Method and Description |
---|---|
int | FrozenLayerParamInitializer.numParams(Layer layer) |
int | ConvolutionParamInitializer.numParams(Layer l) |
int | EmptyParamInitializer.numParams(Layer layer) |
int | BatchNormalizationParamInitializer.numParams(Layer l) |
int | LSTMParamInitializer.numParams(Layer l) |
int | GravesBidirectionalLSTMParamInitializer.numParams(Layer l) |
int | GravesLSTMParamInitializer.numParams(Layer l) |
int | DefaultParamInitializer.numParams(Layer l) |
Modifier and Type | Method and Description |
---|---|
TransferLearning.Builder | TransferLearning.Builder.addLayer(Layer layer) Add a layer to the net; required if layers were removed. |
TransferLearning.GraphBuilder | TransferLearning.GraphBuilder.addLayer(String layerName, Layer layer, InputPreProcessor preProcessor, String... layerInputs) Add a layer with the specified preprocessor. |
TransferLearning.GraphBuilder | TransferLearning.GraphBuilder.addLayer(String layerName, Layer layer, String... layerInputs) Add a layer of the specified configuration to the computation graph. |
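A typical use of TransferLearning.Builder.addLayer(Layer) is to replace the output layer of a pretrained network. This sketch builds a tiny stand-in network rather than loading a real pretrained model, and assumes the TransferLearning and FineTuneConfiguration classes in org.deeplearning4j.nn.transferlearning along with the setFeatureExtractor and removeOutputLayer builder methods:

```java
import org.deeplearning4j.nn.conf.MultiLayerConfiguration;
import org.deeplearning4j.nn.conf.NeuralNetConfiguration;
import org.deeplearning4j.nn.conf.layers.DenseLayer;
import org.deeplearning4j.nn.conf.layers.OutputLayer;
import org.deeplearning4j.nn.multilayer.MultiLayerNetwork;
import org.deeplearning4j.nn.transferlearning.FineTuneConfiguration;
import org.deeplearning4j.nn.transferlearning.TransferLearning;
import org.nd4j.linalg.activations.Activation;
import org.nd4j.linalg.lossfunctions.LossFunctions;

public class TransferLearningExample {
    public static void main(String[] args) {
        // A small "pretrained" network standing in for a model loaded from disk.
        MultiLayerConfiguration conf = new NeuralNetConfiguration.Builder()
                .list()
                .layer(0, new DenseLayer.Builder().nIn(4).nOut(8)
                        .activation(Activation.RELU).build())
                .layer(1, new OutputLayer.Builder(LossFunctions.LossFunction.MCXENT)
                        .nIn(8).nOut(3).activation(Activation.SOFTMAX).build())
                .build();
        MultiLayerNetwork pretrained = new MultiLayerNetwork(conf);
        pretrained.init();

        // Remove the old 3-class output layer and add a new 5-class one;
        // addLayer(Layer) is required here because a layer was removed.
        MultiLayerNetwork adapted = new TransferLearning.Builder(pretrained)
                .fineTuneConfiguration(new FineTuneConfiguration.Builder().build())
                .setFeatureExtractor(0)   // freeze layer 0 during further training
                .removeOutputLayer()
                .addLayer(new OutputLayer.Builder(LossFunctions.LossFunction.MCXENT)
                        .nIn(8).nOut(5).activation(Activation.SOFTMAX).build())
                .build();
        System.out.println(adapted.summary());
    }
}
```

The GraphBuilder overloads listed above work the same way for ComputationGraph models, with layers addressed by name rather than index.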
Copyright © 2017. All rights reserved.