Package | Description |
---|---|
org.deeplearning4j.nn.transferlearning | |
Modifier and Type | Method | Description |
---|---|---|
TransferLearning.GraphBuilder | addInputs(String... inputNames) | |
TransferLearning.GraphBuilder | addLayer(String layerName, Layer layer, InputPreProcessor preProcessor, String... layerInputs) | Add a layer with the specified preprocessor |
TransferLearning.GraphBuilder | addLayer(String layerName, Layer layer, String... layerInputs) | Add a layer of the specified configuration to the computation graph |
TransferLearning.GraphBuilder | addVertex(String vertexName, GraphVertex vertex, String... vertexInputs) | Add a vertex of the given configuration to the computation graph |
TransferLearning.GraphBuilder | fineTuneConfiguration(FineTuneConfiguration fineTuneConfiguration) | Set parameters that selectively override the existing learning parameters (see the first sketch below this table) |
TransferLearning.GraphBuilder | nOutReplace(String layerName, int nOut, Distribution dist) | Modify the architecture of a vertex layer by changing its nOut. This also affects the layer that follows the specified layer, unless it is the output layer. Modifying nOut of layers that feed into non-layer vertices (merge, subset, etc.) is currently not supported; for those, remove the vertex and then add a new one. Different weight initializations can be given for the specified layer and the layer that follows it (second sketch below this table) |
TransferLearning.GraphBuilder | nOutReplace(String layerName, int nOut, Distribution dist, Distribution distNext) | Modify nOut of the specified layer, with separate weight distributions for it and the layer that follows |
TransferLearning.GraphBuilder | nOutReplace(String layerName, int nOut, Distribution dist, WeightInit scheme) | |
TransferLearning.GraphBuilder | nOutReplace(String layerName, int nOut, WeightInit scheme) | Modify the architecture of a vertex layer by changing its nOut, using a weight initialization scheme; otherwise behaves like the Distribution variant above |
TransferLearning.GraphBuilder | nOutReplace(String layerName, int nOut, WeightInit scheme, Distribution dist) | |
TransferLearning.GraphBuilder | nOutReplace(String layerName, int nOut, WeightInit scheme, WeightInit schemeNext) | |
TransferLearning.GraphBuilder | removeVertexAndConnections(String vertexName) | Remove the specified vertex and its connections from the computation graph |
TransferLearning.GraphBuilder | removeVertexKeepConnections(String outputName) | Remove the specified vertex from the computation graph but keep its connections (third sketch below this table, together with addLayer and setOutputs) |
TransferLearning.GraphBuilder | setFeatureExtractor(String... layerName) | Specify layer vertices to use as "feature extractors": each specified layer vertex, and every layer on the path from an input vertex to it, is "frozen" with its parameters held constant (first sketch below this table) |
TransferLearning.GraphBuilder | setInputs(String... inputs) | Set new inputs for the computation graph |
TransferLearning.GraphBuilder | setInputTypes(InputType... inputTypes) | Set the input type of the corresponding inputs |
TransferLearning.GraphBuilder | setOutputs(String... outputNames) | Set outputs for the computation graph, adding to any existing outputs; also determines their order, as in ComputationGraphConfiguration |
TransferLearning.GraphBuilder | setWorkspaceMode(WorkspaceMode workspaceMode) | |
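For orientation, a minimal sketch of how fineTuneConfiguration and setFeatureExtractor are typically chained. This is hedged, not a definitive recipe: pretrainedGraph stands for an already-trained ComputationGraph, the layer name "fc2" is a hypothetical name from that graph, and the FineTuneConfiguration builder details (updater style, learning rate) vary between DL4J versions.

```java
import org.deeplearning4j.nn.graph.ComputationGraph;
import org.deeplearning4j.nn.transferlearning.FineTuneConfiguration;
import org.deeplearning4j.nn.transferlearning.TransferLearning;
import org.nd4j.linalg.learning.config.Nesterovs;

public class FreezeAndFineTuneSketch {

    // pretrainedGraph is assumed to be an already-trained ComputationGraph,
    // e.g. restored from a saved model; "fc2" is a hypothetical layer name.
    static ComputationGraph freezeAndFineTune(ComputationGraph pretrainedGraph) {
        // Learning parameters that selectively override those of the pretrained graph.
        // Illustrative values; older DL4J versions configure the updater via the
        // Updater enum and a separate learningRate(...) call instead.
        FineTuneConfiguration fineTuneConf = new FineTuneConfiguration.Builder()
                .updater(new Nesterovs(5e-5))
                .seed(123)
                .build();

        return new TransferLearning.GraphBuilder(pretrainedGraph)
                .fineTuneConfiguration(fineTuneConf) // selectively override learning parameters
                .setFeatureExtractor("fc2")          // freeze "fc2" and every layer between the inputs and it
                .build();
    }
}
```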
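A second sketch, for nOutReplace: changing a layer's nOut also adjusts the nIn of the layer that consumes its output, and the two can be re-initialized with different schemes. Again, pretrainedGraph and the layer name "fc2" are assumptions.

```java
import org.deeplearning4j.nn.graph.ComputationGraph;
import org.deeplearning4j.nn.transferlearning.TransferLearning;
import org.deeplearning4j.nn.weights.WeightInit;

public class NOutReplaceSketch {

    static ComputationGraph shrinkLayer(ComputationGraph pretrainedGraph) {
        return new TransferLearning.GraphBuilder(pretrainedGraph)
                // Change nOut of the hypothetical "fc2" layer to 256; the layer that
                // follows it has its nIn updated to match. XAVIER re-initializes the
                // changed layer's weights, RELU re-initializes the following layer's.
                .nOutReplace("fc2", 256, WeightInit.XAVIER, WeightInit.RELU)
                .build();
    }
}
```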
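And a third sketch, combining removeVertexKeepConnections, addLayer, and setOutputs to swap in a new output layer. The names "predictions" and "fc2", and the nIn of 4096, are assumptions about the pretrained graph, not part of this API.

```java
import org.deeplearning4j.nn.conf.layers.OutputLayer;
import org.deeplearning4j.nn.graph.ComputationGraph;
import org.deeplearning4j.nn.transferlearning.TransferLearning;
import org.deeplearning4j.nn.weights.WeightInit;
import org.nd4j.linalg.activations.Activation;
import org.nd4j.linalg.lossfunctions.LossFunctions;

public class ReplaceOutputLayerSketch {

    static ComputationGraph replaceOutput(ComputationGraph pretrainedGraph, int numClasses) {
        return new TransferLearning.GraphBuilder(pretrainedGraph)
                .removeVertexKeepConnections("predictions") // drop the old output layer, keep its incoming connections
                .addLayer("predictions",
                        new OutputLayer.Builder(LossFunctions.LossFunction.NEGATIVELOGLIKELIHOOD)
                                .nIn(4096)                  // assumed width of the preceding "fc2" layer
                                .nOut(numClasses)
                                .weightInit(WeightInit.XAVIER)
                                .activation(Activation.SOFTMAX)
                                .build(),
                        "fc2")                              // connect the new layer to "fc2"
                .setOutputs("predictions")                  // register the new layer as a graph output
                .build();
    }
}
```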