Package | Description
---|---
org.deeplearning4j.nn.graph.vertex.impl |
org.deeplearning4j.nn.graph.vertex.impl.rnn |
Modifier and Type | Class and Description
---|---
class | ElementWiseVertex<br>An ElementWiseVertex is used to combine the activations of two or more layers in an element-wise manner. For example, the activations may be combined by addition, subtraction, or multiplication.
class | InputVertex<br>An InputVertex simply defines the location (and connection structure) of inputs to the ComputationGraph.
class | LayerVertex<br>A LayerVertex is a GraphVertex with a neural network Layer (and, optionally, an InputPreProcessor) in it.
class | MergeVertex<br>A MergeVertex is used to combine the activations of two or more layers/GraphVertex instances by means of concatenation/merging. Exactly how this is done depends on the type of input. For 2d (feed-forward layer) inputs: MergeVertex([numExamples, layerSize1], [numExamples, layerSize2]) -> [numExamples, layerSize1 + layerSize2]. For 3d (time series) inputs: MergeVertex([numExamples, layerSize1, timeSeriesLength], [numExamples, layerSize2, timeSeriesLength]) -> [numExamples, layerSize1 + layerSize2, timeSeriesLength]. For 4d (convolutional) inputs: MergeVertex([numExamples, depth1, width, height], [numExamples, depth2, width, height]) -> [numExamples, depth1 + depth2, width, height].
class | PreprocessorVertex<br>A PreprocessorVertex is a simple adaptor class that allows an InputPreProcessor to be used in a ComputationGraph GraphVertex, without being associated with a layer.
class | SubsetVertex<br>A SubsetVertex is used to select a subset of the activations out of another GraphVertex. For example, a subset of the activations out of a layer. Note that this subset is specified by means of an interval of the original activations.
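The element-wise, merge, and subset semantics above can be sketched on plain 2d Java arrays indexed as [example][feature]. This is an illustrative model of the shape behaviour only, not DL4J code: the actual vertices operate on ND4J INDArrays, and the class and method names below are hypothetical.

```java
public class VertexOps {

    // ElementWiseVertex (addition): both inputs must have identical shape
    // [numExamples, layerSize]; the output has the same shape.
    static double[][] elementWiseAdd(double[][] a, double[][] b) {
        double[][] out = new double[a.length][a[0].length];
        for (int i = 0; i < a.length; i++)
            for (int j = 0; j < a[0].length; j++)
                out[i][j] = a[i][j] + b[i][j];
        return out;
    }

    // MergeVertex on 2d inputs: [numExamples, size1] + [numExamples, size2]
    // -> [numExamples, size1 + size2], concatenating along the feature axis.
    static double[][] merge2d(double[][] a, double[][] b) {
        double[][] out = new double[a.length][a[0].length + b[0].length];
        for (int i = 0; i < a.length; i++) {
            System.arraycopy(a[i], 0, out[i], 0, a[i].length);
            System.arraycopy(b[i], 0, out[i], a[i].length, b[i].length);
        }
        return out;
    }

    // SubsetVertex: select the interval [from, to] (inclusive) of the
    // activations of each example.
    static double[][] subset(double[][] in, int from, int to) {
        double[][] out = new double[in.length][to - from + 1];
        for (int i = 0; i < in.length; i++)
            System.arraycopy(in[i], from, out[i], 0, to - from + 1);
        return out;
    }
}
```

Note that merge2d requires only matching numExamples, while elementWiseAdd requires both inputs to have identical shape, mirroring the constraints the corresponding vertices impose.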
Modifier and Type | Class and Description
---|---
class | DuplicateToTimeSeriesVertex<br>A DuplicateToTimeSeriesVertex is a vertex that goes from 2d activations to 3d time series activations, by means of duplication.
class | LastTimeStepVertex<br>A LastTimeStepVertex is used in the context of recurrent neural network activations, to go from 3d (time series) activations to 2d activations by extracting the activations at the last time step of each example. This can be used, for example, in sequence-to-sequence architectures, and potentially for sequence classification.
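The two RNN vertices can likewise be sketched on plain 3d arrays indexed as [example][feature][timeStep]. Again this is an illustrative model under simplifying assumptions (all sequences full length; DL4J's real implementations work on INDArrays and also handle masking, which this sketch ignores), and the names are hypothetical.

```java
public class RnnVertexOps {

    // LastTimeStepVertex: [numExamples, size, timeSeriesLength] -> [numExamples, size],
    // keeping only the activations at the final time step of each example.
    static double[][] lastTimeStep(double[][][] in) {
        int lastStep = in[0][0].length - 1;
        double[][] out = new double[in.length][in[0].length];
        for (int ex = 0; ex < in.length; ex++)
            for (int f = 0; f < in[0].length; f++)
                out[ex][f] = in[ex][f][lastStep];
        return out;
    }

    // DuplicateToTimeSeriesVertex: [numExamples, size] -> [numExamples, size, timeSeriesLength],
    // repeating the same 2d activations at every time step.
    static double[][][] duplicateToTimeSeries(double[][] in, int timeSeriesLength) {
        double[][][] out = new double[in.length][in[0].length][timeSeriesLength];
        for (int ex = 0; ex < in.length; ex++)
            for (int f = 0; f < in[0].length; f++)
                for (int t = 0; t < timeSeriesLength; t++)
                    out[ex][f][t] = in[ex][f];
        return out;
    }
}
```

In a sequence-to-sequence setup, the two operations are naturally paired: the encoder's final state is extracted with the last-time-step operation, then broadcast across the decoder's time axis with the duplication operation.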
Copyright © 2016. All Rights Reserved.