Package | Description |
---|---|
org.deeplearning4j.nn.transferlearning | |

Modifier and Type | Method and Description |
---|---|
TransferLearning.Builder | addLayer(Layer layer): Add layers to the net. Required if layers are removed. |
TransferLearning.Builder | fineTuneConfiguration(FineTuneConfiguration finetuneConfiguration): The fine-tune configuration specified here overrides the existing configuration, if any. For example, specifying a learning rate sets that learning rate on all layers. Refer to the FineTuneConfiguration class for more details. |
TransferLearning.Builder | nOutReplace(int layerNum, int nOut, Distribution dist): Modify the architecture of a layer by changing its nOut. Note that this also affects the layer that follows the specified layer, unless it is the output layer. |
TransferLearning.Builder | nOutReplace(int layerNum, int nOut, Distribution dist, Distribution distNext): Modify the architecture of a layer by changing its nOut. Note that this also affects the layer that follows the specified layer, unless it is the output layer. Allows different weight initialization schemes for the specified layer and the layer that follows it. |
TransferLearning.Builder | nOutReplace(int layerNum, int nOut, Distribution dist, WeightInit schemeNext): Modify the architecture of a layer by changing its nOut. Note that this also affects the layer that follows the specified layer, unless it is the output layer. Allows different weight initialization schemes for the specified layer and the layer that follows it. |
TransferLearning.Builder | nOutReplace(int layerNum, int nOut, WeightInit scheme): Modify the architecture of a layer by changing its nOut. Note that this also affects the layer that follows the specified layer, unless it is the output layer. |
TransferLearning.Builder | nOutReplace(int layerNum, int nOut, WeightInit scheme, Distribution distNext): Modify the architecture of a layer by changing its nOut. Note that this also affects the layer that follows the specified layer, unless it is the output layer. Allows different weight initialization schemes for the specified layer and the layer that follows it. |
TransferLearning.Builder | nOutReplace(int layerNum, int nOut, WeightInit scheme, WeightInit schemeNext): Modify the architecture of a layer by changing its nOut. Note that this also affects the layer that follows the specified layer, unless it is the output layer. Allows different weight initialization schemes for the specified layer and the layer that follows it. |
TransferLearning.Builder | removeLayersFromOutput(int layerNum): Remove the last layerNum layers of the net. At least an output layer must be added back in. |
TransferLearning.Builder | removeOutputLayer(): Helper method to remove the output layer of the net. |
TransferLearning.Builder | setFeatureExtractor(int layerNum): Specify a layer to act as a "feature extractor". The specified layer and the layers preceding it are "frozen", with their parameters held constant. |
TransferLearning.Builder | setInputPreProcessor(int layer, InputPreProcessor processor): Specify the preprocessor for the added layers, for cases where it cannot be inferred automatically. |
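
Taken together, these methods form a fluent builder over a pretrained network. Below is a minimal sketch of a typical transfer-learning chain. It assumes `pretrained` is some previously trained MultiLayerNetwork; the layer indices, layer sizes, class count, and updater settings are purely illustrative, and the exact FineTuneConfiguration builder options vary between DL4J releases.

```java
import org.deeplearning4j.nn.conf.layers.OutputLayer;
import org.deeplearning4j.nn.multilayer.MultiLayerNetwork;
import org.deeplearning4j.nn.transferlearning.FineTuneConfiguration;
import org.deeplearning4j.nn.transferlearning.TransferLearning;
import org.deeplearning4j.nn.weights.WeightInit;
import org.nd4j.linalg.activations.Activation;
import org.nd4j.linalg.learning.config.Nesterovs;
import org.nd4j.linalg.lossfunctions.LossFunctions;

public class TransferLearningSketch {

    public static MultiLayerNetwork adapt(MultiLayerNetwork pretrained) {
        // Fine-tune settings that override the existing configuration on all layers.
        FineTuneConfiguration fineTuneConf = new FineTuneConfiguration.Builder()
                .updater(new Nesterovs(0.01, 0.9)) // illustrative learning rate / momentum
                .seed(123)
                .build();

        return new TransferLearning.Builder(pretrained)
                .fineTuneConfiguration(fineTuneConf)
                // Freeze layers 0..2: their parameters stay constant during training.
                .setFeatureExtractor(2)
                // Change nOut of layer 3 and re-initialize it with the given scheme;
                // the nIn of the layer that follows is adjusted to match.
                .nOutReplace(3, 128, WeightInit.XAVIER)
                // Drop the old output layer; an output layer must be added back.
                .removeOutputLayer()
                .addLayer(new OutputLayer.Builder(LossFunctions.LossFunction.NEGATIVELOGLIKELIHOOD)
                        .nIn(128)
                        .nOut(5) // hypothetical new number of classes
                        .activation(Activation.SOFTMAX)
                        .weightInit(WeightInit.XAVIER)
                        .build())
                // If a preprocessor for an added layer cannot be inferred, it can be
                // set explicitly, e.g.:
                // .setInputPreProcessor(4, new CnnToFeedForwardPreProcessor(7, 7, 512))
                .build();
    }
}
```

The four-argument nOutReplace overloads accept any mix of Distribution and WeightInit, so the modified layer and the layer after it can be re-initialized under different schemes, for example nOutReplace(3, 128, new NormalDistribution(0, 0.01), WeightInit.XAVIER).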