Package | Description |
---|---|
org.deeplearning4j.nn.conf | |
org.deeplearning4j.nn.conf.layers | |
org.deeplearning4j.nn.conf.layers.setup | |
org.deeplearning4j.nn.layers | |
org.deeplearning4j.nn.layers.recurrent | |
Modifier and Type | Field and Description |
---|---|
protected Layer | NeuralNetConfiguration.layer |
protected Layer | NeuralNetConfiguration.Builder.layer |
Modifier and Type | Method and Description |
---|---|
ComputationGraphConfiguration.GraphBuilder | ComputationGraphConfiguration.GraphBuilder.addLayer(String layerName, Layer layer, InputPreProcessor preProcessor, String... layerInputs) - Add a layer and an InputPreProcessor, with the specified name and specified inputs. |
ComputationGraphConfiguration.GraphBuilder | ComputationGraphConfiguration.GraphBuilder.addLayer(String layerName, Layer layer, String... layerInputs) - Add a layer, with no InputPreProcessor, with the specified name and specified inputs. |
NeuralNetConfiguration.ListBuilder | NeuralNetConfiguration.ListBuilder.layer(int ind, Layer layer) |
NeuralNetConfiguration.Builder | NeuralNetConfiguration.Builder.layer(Layer layer) - Layer class. |
NeuralNetConfiguration.ListBuilder | NeuralNetConfiguration.Builder.list(Layer... layers) - Create a ListBuilder (for creating a MultiLayerConfiguration) with the specified layers. Usage: see the sketch after this table. |
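As a rough illustration of how the builder methods listed above fit together (this sketch is not part of the generated reference), the snippet below builds a small MultiLayerConfiguration via list() and layer(int, Layer), and a ComputationGraphConfiguration via graphBuilder() and addLayer(String, Layer, String...). All layer names, sizes, and hyperparameter values are assumptions chosen for illustration.

```java
import org.deeplearning4j.nn.conf.ComputationGraphConfiguration;
import org.deeplearning4j.nn.conf.MultiLayerConfiguration;
import org.deeplearning4j.nn.conf.NeuralNetConfiguration;
import org.deeplearning4j.nn.conf.layers.DenseLayer;
import org.deeplearning4j.nn.conf.layers.OutputLayer;

public class LayerBuilderSketch {
    public static void main(String[] args) {
        // MultiLayerConfiguration: add Layer instances by index via the ListBuilder.
        // Sizes (784 -> 100 -> 10) are illustrative assumptions, not values from this page.
        MultiLayerConfiguration listConf = new NeuralNetConfiguration.Builder()
                .list()
                .layer(0, new DenseLayer.Builder().nIn(784).nOut(100).build())
                .layer(1, new OutputLayer.Builder().nIn(100).nOut(10).build())
                .build();

        // ComputationGraphConfiguration: add named Layer instances and wire them by input name.
        ComputationGraphConfiguration graphConf = new NeuralNetConfiguration.Builder()
                .graphBuilder()
                .addInputs("in")
                .addLayer("dense", new DenseLayer.Builder().nIn(784).nOut(100).build(), "in")
                .addLayer("out", new OutputLayer.Builder().nIn(100).nOut(10).build(), "dense")
                .setOutputs("out")
                .build();
    }
}
```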
Modifier and Type | Class and Description |
---|---|
class | ActivationLayer |
class | AutoEncoder - Autoencoder. |
class | BaseOutputLayer |
class | BasePretrainNetwork |
class | BaseRecurrentLayer |
class | BatchNormalization - Batch normalization configuration. |
class | ConvolutionLayer |
class | DenseLayer - Dense layer: a fully connected feed-forward layer trainable by backprop. |
class | EmbeddingLayer - Embedding layer: a feed-forward layer that expects a single integer per example as input (a class index, in the range 0 to numClass-1). |
class | FeedForwardLayer - Created by jeffreytang on 7/21/15. |
class | GravesBidirectionalLSTM - LSTM recurrent net, based on Graves: Supervised Sequence Labelling with Recurrent Neural Networks, http://www.cs.toronto.edu/~graves/phd.pdf |
class | GravesLSTM - LSTM recurrent net, based on Graves: Supervised Sequence Labelling with Recurrent Neural Networks, http://www.cs.toronto.edu/~graves/phd.pdf |
class | LocalResponseNormalization - Created by nyghtowl on 10/29/15. |
class | OutputLayer - Output layer with different objective co-occurrences for different objectives. |
class | RBM - Restricted Boltzmann Machine. |
class | RnnOutputLayer |
class | SubsamplingLayer - Subsampling layer, also referred to as pooling in convolutional neural nets. Supports the MAX, AVG, and NONE pooling types. |
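For context (again, not part of the generated listing), the snippet below sketches how a few of the Layer subclasses above are typically instantiated through their builders. The kernel sizes, layer sizes, and pooling type used here are assumptions for illustration only.

```java
import org.deeplearning4j.nn.conf.layers.ConvolutionLayer;
import org.deeplearning4j.nn.conf.layers.DenseLayer;
import org.deeplearning4j.nn.conf.layers.GravesLSTM;
import org.deeplearning4j.nn.conf.layers.SubsamplingLayer;

public class LayerConfigSketch {
    public static void main(String[] args) {
        // Convolution layer: 5x5 kernel, 1 input channel, 20 feature maps (assumed values)
        ConvolutionLayer conv = new ConvolutionLayer.Builder(5, 5)
                .nIn(1)
                .nOut(20)
                .build();

        // Subsampling (pooling) layer using MAX pooling with a 2x2 kernel
        SubsamplingLayer pool = new SubsamplingLayer.Builder(SubsamplingLayer.PoolingType.MAX)
                .kernelSize(2, 2)
                .build();

        // Dense layer: fully connected feed-forward layer
        DenseLayer dense = new DenseLayer.Builder().nIn(320).nOut(100).build();

        // Graves LSTM recurrent layer
        GravesLSTM lstm = new GravesLSTM.Builder().nIn(100).nOut(100).build();
    }
}
```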
Modifier and Type | Method and Description |
---|---|
abstract <E extends Layer> | Layer.Builder.build() |
Modifier and Type | Method and Description |
---|---|
Layer | Layer.clone() |
Modifier and Type | Method and Description |
---|---|
Layer | ConvolutionLayerSetup.getLayer(int i, MultiLayerConfiguration.Builder builder) - Deprecated. |
Modifier and Type | Class and Description |
---|---|
class | BaseLayer<LayerConfT extends Layer> - A layer with a bias and activation function. |
Modifier and Type | Class and Description |
---|---|
class | BaseRecurrentLayer<LayerConfT extends Layer> |