Modifier and Type | Method and Description
---|---
List<String> | ParamInitializer.biasKeys(Layer layer) - Bias parameter keys, given the layer configuration.
boolean | ParamInitializer.isBiasParam(Layer layer, String key) - Is the specified parameter a bias?
boolean | ParamInitializer.isWeightParam(Layer layer, String key) - Is the specified parameter a weight?
long | ParamInitializer.numParams(Layer layer)
List<String> | ParamInitializer.paramKeys(Layer layer) - Get a list of all parameter keys, given the layer configuration.
List<String> | ParamInitializer.weightKeys(Layer layer) - Weight parameter keys, given the layer configuration.
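The ParamInitializer contract above can be exercised directly against a layer configuration. A minimal sketch, assuming Deeplearning4j (deeplearning4j-nn) is on the classpath; the class name and the nIn/nOut sizes are illustrative:

```java
import org.deeplearning4j.nn.conf.layers.DenseLayer;
import org.deeplearning4j.nn.conf.layers.Layer;
import org.deeplearning4j.nn.params.DefaultParamInitializer;

public class ParamKeysSketch {
    public static void main(String[] args) {
        // A dense layer with 4 inputs and 3 outputs (illustrative sizes)
        Layer dense = new DenseLayer.Builder().nIn(4).nOut(3).build();
        DefaultParamInitializer init = DefaultParamInitializer.getInstance();

        // paramKeys is the union of weightKeys and biasKeys; for a dense layer
        // these are typically the "W" and "b" parameters
        System.out.println(init.paramKeys(dense));
        // numParams counts every parameter entry: 4*3 weights + 3 biases = 15
        System.out.println(init.numParams(dense));
        // Classify an individual parameter key
        System.out.println(init.isBiasParam(dense, "b"));
    }
}
```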
Modifier and Type | Field and Description
---|---
protected Layer | NeuralNetConfiguration.layer
protected Layer | NeuralNetConfiguration.Builder.layer
Modifier and Type | Method and Description
---|---
ComputationGraphConfiguration.GraphBuilder | ComputationGraphConfiguration.GraphBuilder.addLayer(String layerName, Layer layer, InputPreProcessor preProcessor, String... layerInputs) - Add a layer and an InputPreProcessor, with the specified name and specified inputs.
ComputationGraphConfiguration.GraphBuilder | ComputationGraphConfiguration.GraphBuilder.addLayer(String layerName, Layer layer, String... layerInputs) - Add a layer, with no InputPreProcessor, with the specified name and specified inputs.
ComputationGraphConfiguration.GraphBuilder | ComputationGraphConfiguration.GraphBuilder.appendLayer(String layerName, Layer layer) - Add a layer, with no InputPreProcessor, with the specified name and input from the last added layer/vertex.
ComputationGraphConfiguration.GraphBuilder | ComputationGraphConfiguration.GraphBuilder.appendLayer(String layerName, Layer layer, InputPreProcessor preProcessor) - Add a layer and an InputPreProcessor, with the specified name and input from the last added layer/vertex.
NeuralNetConfiguration.ListBuilder | NeuralNetConfiguration.ListBuilder.layer(int ind, @NonNull Layer layer)
ComputationGraphConfiguration.GraphBuilder | ComputationGraphConfiguration.GraphBuilder.layer(int layerName, Layer layer, String... layerInputs) - Add a layer, with no InputPreProcessor, with the specified name and specified inputs.
NeuralNetConfiguration.ListBuilder | NeuralNetConfiguration.ListBuilder.layer(Layer layer)
NeuralNetConfiguration.Builder | NeuralNetConfiguration.Builder.layer(Layer layer) - Layer class.
ComputationGraphConfiguration.GraphBuilder | ComputationGraphConfiguration.GraphBuilder.layer(String layerName, Layer layer, InputPreProcessor preProcessor, String... layerInputs) - Add a layer and an InputPreProcessor, with the specified name and specified inputs.
ComputationGraphConfiguration.GraphBuilder | ComputationGraphConfiguration.GraphBuilder.layer(String layerName, Layer layer, String... layerInputs) - Add a layer, with no InputPreProcessor, with the specified name and specified inputs.
NeuralNetConfiguration.ListBuilder | NeuralNetConfiguration.Builder.list(Layer... layers) - Create a ListBuilder (for creating a MultiLayerConfiguration) with the specified layers.
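The GraphBuilder methods above compose a graph configuration one named layer at a time. A minimal sketch, assuming Deeplearning4j on the classpath; the layer names and sizes are illustrative:

```java
import org.deeplearning4j.nn.conf.ComputationGraphConfiguration;
import org.deeplearning4j.nn.conf.NeuralNetConfiguration;
import org.deeplearning4j.nn.conf.layers.DenseLayer;
import org.deeplearning4j.nn.conf.layers.OutputLayer;
import org.nd4j.linalg.activations.Activation;
import org.nd4j.linalg.lossfunctions.LossFunctions;

public class GraphBuilderSketch {
    public static void main(String[] args) {
        ComputationGraphConfiguration conf = new NeuralNetConfiguration.Builder()
                .graphBuilder()
                .addInputs("in")
                // addLayer(String, Layer, String...): named layer with explicit inputs
                .addLayer("dense", new DenseLayer.Builder().nIn(10).nOut(20)
                        .activation(Activation.RELU).build(), "in")
                // appendLayer(String, Layer): input is the last added layer/vertex
                .appendLayer("out", new OutputLayer.Builder(LossFunctions.LossFunction.MCXENT)
                        .nIn(20).nOut(3).activation(Activation.SOFTMAX).build())
                .setOutputs("out")
                .build();
        System.out.println(conf.getVertices().keySet());
    }
}
```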
Modifier and Type | Method and Description
---|---
<E extends Layer> E | PrimaryCapsules.Builder.build()
abstract <E extends Layer> E | Layer.Builder.build()
<E extends Layer> E | CapsuleStrengthLayer.Builder.build()
<E extends Layer> E | CapsuleLayer.Builder.build()
Modifier and Type | Method and Description
---|---
Layer | Layer.clone()
Modifier and Type | Method and Description
---|---
static void | LayerValidation.generalValidation(String layerName, Layer layer, IDropout iDropout, List<Regularization> regularization, List<Regularization> regularizationBias, List<LayerConstraint> allParamConstraints, List<LayerConstraint> weightConstraints, List<LayerConstraint> biasConstraints)
Modifier and Type | Class and Description
---|---
class | Cropping1D
class | Cropping2D
class | Cropping3D
Modifier and Type | Class and Description
---|---
class | ElementWiseMultiplicationLayer
class | FrozenLayer
class | FrozenLayerWithBackprop - Freezes the parameters of the layer it wraps, but allows backpropagation to continue through it.
class | RepeatVector
Modifier and Type | Field and Description
---|---
protected Layer | FrozenLayer.layer
Modifier and Type | Method and Description
---|---
Layer | FrozenLayerWithBackprop.clone()
Layer | FrozenLayer.clone()
Modifier and Type | Method and Description
---|---
FrozenLayer.Builder | FrozenLayer.Builder.layer(Layer layer)
Constructor and Description
---
FrozenLayer(Layer layer)
FrozenLayerWithBackprop(Layer layer)
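Both frozen wrappers take the layer to freeze as their single constructor argument. A minimal sketch, assuming Deeplearning4j on the classpath; the dense-layer sizes are illustrative:

```java
import org.deeplearning4j.nn.conf.layers.DenseLayer;
import org.deeplearning4j.nn.conf.layers.Layer;
import org.deeplearning4j.nn.conf.layers.misc.FrozenLayer;
import org.deeplearning4j.nn.conf.layers.misc.FrozenLayerWithBackprop;

public class FrozenSketch {
    public static void main(String[] args) {
        Layer dense = new DenseLayer.Builder().nIn(8).nOut(4).build();
        // FrozenLayer: parameters stay fixed and gradients stop at this layer
        Layer frozen = new FrozenLayer(dense);
        // FrozenLayerWithBackprop: parameters stay fixed, but backpropagation
        // continues through the wrapped layer to the layers below it
        Layer frozenBp = new FrozenLayerWithBackprop(
                new DenseLayer.Builder().nIn(8).nOut(4).build());
        System.out.println(frozen + " / " + frozenBp);
    }
}
```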
Modifier and Type | Class and Description
---|---
class | Yolo2OutputLayer
Modifier and Type | Class and Description
---|---
class | Bidirectional
class | LastTimeStep
class | SimpleRnn
class | TimeDistributed
Modifier and Type | Method and Description
---|---
Layer | LastTimeStep.getUnderlying()
Modifier and Type | Method and Description
---|---
Bidirectional.Builder | Bidirectional.Builder.rnnLayer(Layer layer)
void | Bidirectional.Builder.setLayer(Layer layer)
Constructor and Description
---
Bidirectional(@NonNull Bidirectional.Mode mode, @NonNull Layer layer) - Create a Bidirectional wrapper for the specified layer.
Bidirectional(@NonNull Layer layer) - Create a Bidirectional wrapper with the default mode (CONCAT) for the specified layer.
LastTimeStep(Layer underlying)
TimeDistributed(Layer underlying)
TimeDistributed(@NonNull Layer underlying, RNNFormat rnnDataFormat)
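Each of these recurrent wrappers takes the underlying RNN layer as a constructor argument. A minimal sketch, assuming Deeplearning4j on the classpath; the LSTM sizes are illustrative:

```java
import org.deeplearning4j.nn.conf.layers.LSTM;
import org.deeplearning4j.nn.conf.layers.Layer;
import org.deeplearning4j.nn.conf.layers.recurrent.Bidirectional;
import org.deeplearning4j.nn.conf.layers.recurrent.LastTimeStep;

public class RnnWrapperSketch {
    public static void main(String[] args) {
        // Bidirectional with an explicit mode; CONCAT is also the default
        Layer bi = new Bidirectional(Bidirectional.Mode.CONCAT,
                new LSTM.Builder().nIn(16).nOut(32).build());
        // LastTimeStep reduces the wrapped RNN's output to its final time step
        LastTimeStep last = new LastTimeStep(
                new LSTM.Builder().nIn(16).nOut(32).build());
        // getUnderlying() returns the wrapped LSTM configuration
        System.out.println(bi + " / " + last.getUnderlying());
    }
}
```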
Modifier and Type | Class and Description
---|---
class | AbstractSameDiffLayer
class | SameDiffLambdaLayer
class | SameDiffLayer
class | SameDiffOutputLayer
Modifier and Type | Class and Description
---|---
class | MaskLayer
class | MaskZeroLayer
Modifier and Type | Method and Description
---|---
MaskZeroLayer.Builder | MaskZeroLayer.Builder.setUnderlying(Layer underlying)
MaskZeroLayer.Builder | MaskZeroLayer.Builder.underlying(Layer underlying)
Constructor and Description
---
MaskZeroLayer(Layer underlying, double maskingValue)
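The MaskZeroLayer constructor pairs an underlying layer with the input value to treat as masked. A minimal sketch, assuming Deeplearning4j on the classpath; the LSTM sizes are illustrative:

```java
import org.deeplearning4j.nn.conf.layers.LSTM;
import org.deeplearning4j.nn.conf.layers.util.MaskZeroLayer;

public class MaskZeroSketch {
    public static void main(String[] args) {
        // Time steps whose input equals the masking value (here 0.0, typical
        // for zero-padded sequences) are masked out of the wrapped RNN's
        // computation rather than processed as real data.
        MaskZeroLayer masked = new MaskZeroLayer(
                new LSTM.Builder().nIn(8).nOut(8).build(), 0.0);
        System.out.println(masked);
    }
}
```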
Modifier and Type | Class and Description
---|---
class | VariationalAutoencoder
Modifier and Type | Class and Description
---|---
class | BaseWrapperLayer
Modifier and Type | Field and Description
---|---
protected Layer | BaseWrapperLayer.underlying
Constructor and Description
---
BaseWrapperLayer(Layer underlying)
Modifier and Type | Class and Description
---|---
class | OCNNOutputLayer
Modifier and Type | Method and Description
---|---
protected boolean | BaseNetConfigDeserializer.requiresActivationFromLegacy(Layer[] layers)
protected boolean | BaseNetConfigDeserializer.requiresDropoutFromLegacy(Layer[] layers)
protected boolean | BaseNetConfigDeserializer.requiresIUpdaterFromLegacy(Layer[] layers)
protected boolean | BaseNetConfigDeserializer.requiresLegacyLossHandling(Layer[] layers)
protected boolean | BaseNetConfigDeserializer.requiresRegularizationFromLegacy(Layer[] layers)
protected boolean | BaseNetConfigDeserializer.requiresWeightInitFromLegacy(Layer[] layers)
Modifier and Type | Class and Description
---|---
class | AbstractLayer<LayerConfT extends Layer> - A layer with input and output, but no parameters or gradients.
Modifier and Type | Method and Description
---|---
List<String> | OCNNParamInitializer.biasKeys(Layer layer)
boolean | OCNNParamInitializer.isBiasParam(Layer layer, String key)
boolean | OCNNParamInitializer.isWeightParam(Layer layer, String key)
long | OCNNParamInitializer.numParams(Layer layer)
List<String> | OCNNParamInitializer.paramKeys(Layer layer)
List<String> | OCNNParamInitializer.weightKeys(Layer layer)
Modifier and Type | Class and Description
---|---
class | IdentityLayer
Modifier and Type | Method and Description
---|---
List<String> | WrapperLayerParamInitializer.biasKeys(Layer layer)
List<String> | LSTMParamInitializer.biasKeys(Layer layer)
List<String> | BatchNormalizationParamInitializer.biasKeys(Layer layer)
List<String> | SeparableConvolutionParamInitializer.biasKeys(Layer layer)
List<String> | GravesLSTMParamInitializer.biasKeys(Layer layer)
List<String> | ConvolutionParamInitializer.biasKeys(Layer layer)
List<String> | EmptyParamInitializer.biasKeys(Layer layer)
List<String> | BidirectionalParamInitializer.biasKeys(Layer layer)
List<String> | GravesBidirectionalLSTMParamInitializer.biasKeys(Layer layer)
List<String> | DepthwiseConvolutionParamInitializer.biasKeys(Layer layer)
List<String> | SameDiffParamInitializer.biasKeys(Layer layer)
List<String> | SimpleRnnParamInitializer.biasKeys(Layer layer)
List<String> | PReLUParamInitializer.biasKeys(Layer layer)
List<String> | FrozenLayerParamInitializer.biasKeys(Layer layer)
List<String> | VariationalAutoencoderParamInitializer.biasKeys(Layer layer)
List<String> | FrozenLayerWithBackpropParamInitializer.biasKeys(Layer layer)
List<String> | DefaultParamInitializer.biasKeys(Layer layer)
protected boolean | DefaultParamInitializer.hasBias(Layer layer)
protected boolean | SimpleRnnParamInitializer.hasLayerNorm(Layer layer)
protected boolean | DefaultParamInitializer.hasLayerNorm(Layer layer)
boolean | WrapperLayerParamInitializer.isBiasParam(Layer layer, String key)
boolean | LSTMParamInitializer.isBiasParam(Layer layer, String key)
boolean | BatchNormalizationParamInitializer.isBiasParam(Layer layer, String key)
boolean | SeparableConvolutionParamInitializer.isBiasParam(Layer layer, String key)
boolean | GravesLSTMParamInitializer.isBiasParam(Layer layer, String key)
boolean | ConvolutionParamInitializer.isBiasParam(Layer layer, String key)
boolean | EmptyParamInitializer.isBiasParam(Layer layer, String key)
boolean | BidirectionalParamInitializer.isBiasParam(Layer layer, String key)
boolean | GravesBidirectionalLSTMParamInitializer.isBiasParam(Layer layer, String key)
boolean | DepthwiseConvolutionParamInitializer.isBiasParam(Layer layer, String key)
boolean | SameDiffParamInitializer.isBiasParam(Layer layer, String key)
boolean | SimpleRnnParamInitializer.isBiasParam(Layer layer, String key)
boolean | PReLUParamInitializer.isBiasParam(Layer layer, String key)
boolean | FrozenLayerParamInitializer.isBiasParam(Layer layer, String key)
boolean | VariationalAutoencoderParamInitializer.isBiasParam(Layer layer, String key)
boolean | FrozenLayerWithBackpropParamInitializer.isBiasParam(Layer layer, String key)
boolean | DefaultParamInitializer.isBiasParam(Layer layer, String key)
boolean | WrapperLayerParamInitializer.isWeightParam(Layer layer, String key)
boolean | LSTMParamInitializer.isWeightParam(Layer layer, String key)
boolean | BatchNormalizationParamInitializer.isWeightParam(Layer layer, String key)
boolean | SeparableConvolutionParamInitializer.isWeightParam(Layer layer, String key)
boolean | GravesLSTMParamInitializer.isWeightParam(Layer layer, String key)
boolean | ConvolutionParamInitializer.isWeightParam(Layer layer, String key)
boolean | EmptyParamInitializer.isWeightParam(Layer layer, String key)
boolean | BidirectionalParamInitializer.isWeightParam(Layer layer, String key)
boolean | GravesBidirectionalLSTMParamInitializer.isWeightParam(Layer layer, String key)
boolean | DepthwiseConvolutionParamInitializer.isWeightParam(Layer layer, String key)
boolean | SameDiffParamInitializer.isWeightParam(Layer layer, String key)
boolean | SimpleRnnParamInitializer.isWeightParam(Layer layer, String key)
boolean | PReLUParamInitializer.isWeightParam(Layer layer, String key)
boolean | FrozenLayerParamInitializer.isWeightParam(Layer layer, String key)
boolean | VariationalAutoencoderParamInitializer.isWeightParam(Layer layer, String key)
boolean | FrozenLayerWithBackpropParamInitializer.isWeightParam(Layer layer, String key)
boolean | DefaultParamInitializer.isWeightParam(Layer layer, String key)
long | WrapperLayerParamInitializer.numParams(Layer layer)
long | LSTMParamInitializer.numParams(Layer l)
long | BatchNormalizationParamInitializer.numParams(Layer l)
long | SeparableConvolutionParamInitializer.numParams(Layer l)
long | GravesLSTMParamInitializer.numParams(Layer l)
long | ConvolutionParamInitializer.numParams(Layer l)
long | EmptyParamInitializer.numParams(Layer layer)
long | Convolution3DParamInitializer.numParams(Layer l)
long | BidirectionalParamInitializer.numParams(Layer layer)
long | Deconvolution3DParamInitializer.numParams(Layer l)
long | ElementWiseParamInitializer.numParams(Layer layer)
long | GravesBidirectionalLSTMParamInitializer.numParams(Layer l)
long | DepthwiseConvolutionParamInitializer.numParams(Layer l)
long | SameDiffParamInitializer.numParams(Layer layer)
long | SimpleRnnParamInitializer.numParams(Layer layer)
long | PReLUParamInitializer.numParams(Layer l)
long | FrozenLayerParamInitializer.numParams(Layer layer)
long | FrozenLayerWithBackpropParamInitializer.numParams(Layer layer)
long | DefaultParamInitializer.numParams(Layer l)
List<String> | WrapperLayerParamInitializer.paramKeys(Layer layer)
List<String> | LSTMParamInitializer.paramKeys(Layer layer)
List<String> | BatchNormalizationParamInitializer.paramKeys(Layer layer)
List<String> | SeparableConvolutionParamInitializer.paramKeys(Layer layer)
List<String> | GravesLSTMParamInitializer.paramKeys(Layer layer)
List<String> | ConvolutionParamInitializer.paramKeys(Layer layer)
List<String> | EmptyParamInitializer.paramKeys(Layer layer)
List<String> | BidirectionalParamInitializer.paramKeys(Layer layer)
List<String> | GravesBidirectionalLSTMParamInitializer.paramKeys(Layer layer)
List<String> | DepthwiseConvolutionParamInitializer.paramKeys(Layer layer)
List<String> | SameDiffParamInitializer.paramKeys(Layer layer)
List<String> | SimpleRnnParamInitializer.paramKeys(Layer layer)
List<String> | PReLUParamInitializer.paramKeys(Layer layer)
List<String> | FrozenLayerParamInitializer.paramKeys(Layer layer)
List<String> | VariationalAutoencoderParamInitializer.paramKeys(Layer l)
List<String> | FrozenLayerWithBackpropParamInitializer.paramKeys(Layer layer)
List<String> | DefaultParamInitializer.paramKeys(Layer layer)
List<String> | WrapperLayerParamInitializer.weightKeys(Layer layer)
List<String> | LSTMParamInitializer.weightKeys(Layer layer)
List<String> | BatchNormalizationParamInitializer.weightKeys(Layer layer)
List<String> | SeparableConvolutionParamInitializer.weightKeys(Layer layer)
List<String> | GravesLSTMParamInitializer.weightKeys(Layer layer)
List<String> | ConvolutionParamInitializer.weightKeys(Layer layer)
List<String> | EmptyParamInitializer.weightKeys(Layer layer)
List<String> | BidirectionalParamInitializer.weightKeys(Layer layer)
List<String> | GravesBidirectionalLSTMParamInitializer.weightKeys(Layer layer)
List<String> | DepthwiseConvolutionParamInitializer.weightKeys(Layer layer)
List<String> | SameDiffParamInitializer.weightKeys(Layer layer)
List<String> | SimpleRnnParamInitializer.weightKeys(Layer layer)
List<String> | PReLUParamInitializer.weightKeys(Layer layer)
List<String> | FrozenLayerParamInitializer.weightKeys(Layer layer)
List<String> | VariationalAutoencoderParamInitializer.weightKeys(Layer layer)
List<String> | FrozenLayerWithBackpropParamInitializer.weightKeys(Layer layer)
List<String> | DefaultParamInitializer.weightKeys(Layer layer)
Modifier and Type | Method and Description
---|---
TransferLearning.Builder | TransferLearning.Builder.addLayer(Layer layer) - Add layers to the net. Required if layers are removed.
TransferLearning.GraphBuilder | TransferLearning.GraphBuilder.addLayer(String layerName, Layer layer, InputPreProcessor preProcessor, String... layerInputs) - Add a layer with a specified preprocessor.
TransferLearning.GraphBuilder | TransferLearning.GraphBuilder.addLayer(String layerName, Layer layer, String... layerInputs) - Add a layer of the specified configuration to the computation graph.
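A typical use of TransferLearning.Builder.addLayer is replacing a pretrained network's output layer. A minimal sketch, assuming Deeplearning4j on the classpath; `pretrained` stands in for a previously trained network, and the sizes are illustrative:

```java
import org.deeplearning4j.nn.conf.layers.OutputLayer;
import org.deeplearning4j.nn.multilayer.MultiLayerNetwork;
import org.deeplearning4j.nn.transferlearning.TransferLearning;
import org.nd4j.linalg.activations.Activation;
import org.nd4j.linalg.lossfunctions.LossFunctions;

public class TransferSketch {
    // Replace the network's head while keeping the remaining layers
    static MultiLayerNetwork replaceHead(MultiLayerNetwork pretrained) {
        return new TransferLearning.Builder(pretrained)
                .removeOutputLayer()        // drop the old output layer...
                .addLayer(new OutputLayer.Builder(LossFunctions.LossFunction.MCXENT)
                        .nIn(20).nOut(5)    // ...and add a new one (sizes illustrative)
                        .activation(Activation.SOFTMAX).build())
                .build();
    }
}
```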
Modifier and Type | Method and Description
---|---
static Convolution3D.DataFormat | Convolution3DUtils.getFormatForLayer(Layer inputLayer) - Returns the Convolution3D.DataFormat for the associated layer.
static CNN2DFormat | ConvolutionUtils.getFormatForLayer(Layer layer) - Get the format for a given layer.
static RNNFormat | TimeSeriesUtils.getFormatFromRnnLayer(Layer layer) - Get the RNNFormat from the RNN layer, accounting for the presence of wrapper layers such as Bidirectional and LastTimeStep.
static RNNFormat | Convolution1DUtils.getRnnFormatFromLayer(Layer layer) - Get the RNNFormat for the given layer.
static boolean | Convolution1DUtils.hasRnnDataFormat(Layer layer) - Returns true if the given layer has an RNNFormat.
static boolean | Convolution3DUtils.layerHasConvolution3DLayout(Layer layer) - Returns true if the layer is a 3D convolution, pooling, or upsampling layer: Convolution3D, Deconvolution3D, Subsampling3DLayer, or Upsampling3D.
static boolean | ConvolutionUtils.layerHasConvolutionLayout(Layer layer) - Returns true if a layer has a CNN2DFormat property.
static void | OutputLayerUtil.validateOutputLayer(String layerName, Layer layer) - Validate the output layer (or loss layer) configuration, to detect invalid configurations.
static void | OutputLayerUtil.validateOutputLayerForClassifierEvaluation(Layer outputLayer, Class<? extends IEvaluation> classifierEval) - Validates whether the output layer configuration is valid for classifier evaluation.
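The output-layer validation helpers above can be called directly on a layer configuration. A minimal sketch, assuming Deeplearning4j on the classpath; the layer name and sizes are illustrative, and a consistent softmax/MCXENT head is expected to pass validation without complaint:

```java
import org.deeplearning4j.nn.conf.layers.OutputLayer;
import org.deeplearning4j.util.OutputLayerUtil;
import org.nd4j.linalg.activations.Activation;
import org.nd4j.linalg.lossfunctions.LossFunctions;

public class ValidateSketch {
    public static void main(String[] args) {
        // Softmax activation paired with multi-class cross-entropy is a
        // consistent classifier head, so this validation should not flag it.
        OutputLayerUtil.validateOutputLayer("out",
                new OutputLayer.Builder(LossFunctions.LossFunction.MCXENT)
                        .nIn(10).nOut(3)
                        .activation(Activation.SOFTMAX).build());
    }
}
```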
Copyright © 2022. All rights reserved.