FineTuneConfiguration.Builder |
FineTuneConfiguration.Builder.activation(Activation activation) |
Activation function / neuron non-linearity
|
FineTuneConfiguration.Builder |
FineTuneConfiguration.Builder.activation(IActivation activationFn) |
Activation function / neuron non-linearity
|
FineTuneConfiguration.Builder |
FineTuneConfiguration.Builder.backprop(boolean backprop) |
|
FineTuneConfiguration.Builder |
FineTuneConfiguration.Builder.backpropType(BackpropType backpropType) |
The type of backprop.
|
FineTuneConfiguration.Builder |
FineTuneConfiguration.Builder.biasInit(double biasInit) |
Constant for bias initialization.
|
FineTuneConfiguration.Builder |
FineTuneConfiguration.Builder.biasUpdater(IUpdater biasUpdater) |
Gradient updater configuration, for the biases only.
|
static FineTuneConfiguration.Builder |
FineTuneConfiguration.builder() |
|
FineTuneConfiguration.Builder |
FineTuneConfiguration.Builder.constraints(List<LayerConstraint> constraints) |
Set constraints to be applied to all layers.
|
FineTuneConfiguration.Builder |
FineTuneConfiguration.Builder.convolutionMode(ConvolutionMode convolutionMode) |
Sets the convolution mode for convolutional layers, which impacts padding and output sizes.
|
FineTuneConfiguration.Builder |
FineTuneConfiguration.Builder.cudnnAlgoMode(ConvolutionLayer.AlgoMode cudnnAlgoMode) |
Sets the cuDNN algo mode for convolutional layers, which impacts performance and memory usage of cuDNN.
|
FineTuneConfiguration.Builder |
FineTuneConfiguration.Builder.dist(Distribution dist) |
Deprecated.
|
FineTuneConfiguration.Builder |
FineTuneConfiguration.Builder.dropout(IDropout dropout) |
Set the dropout.
|
FineTuneConfiguration.Builder |
FineTuneConfiguration.Builder.dropOut(double inputRetainProbability) |
Dropout probability: the probability of retaining each input activation value.
|
FineTuneConfiguration.Builder |
FineTuneConfiguration.Builder.gradientNormalization(GradientNormalization gradientNormalization) |
Gradient normalization strategy.
|
FineTuneConfiguration.Builder |
FineTuneConfiguration.Builder.gradientNormalizationThreshold(double gradientNormalizationThreshold) |
Threshold for gradient normalization, only used for GradientNormalization.ClipL2PerLayer,
GradientNormalization.ClipL2PerParamType, and GradientNormalization.ClipElementWiseAbsoluteValue.
Not used otherwise.
This is the L2 threshold for the first two types of clipping, or the absolute value threshold for the last type.
|
FineTuneConfiguration.Builder |
FineTuneConfiguration.Builder.inferenceWorkspaceMode(WorkspaceMode inferenceWorkspaceMode) |
This method defines the workspace mode used during inference:
NONE: workspaces won't be used
ENABLED: workspaces will be used for inference (reduced memory and better performance)
|
FineTuneConfiguration.Builder |
FineTuneConfiguration.Builder.l1(double l1) |
L1 regularization coefficient for the weights (excluding biases)
|
FineTuneConfiguration.Builder |
FineTuneConfiguration.Builder.l1Bias(double l1Bias) |
L1 regularization coefficient for the bias parameters
|
FineTuneConfiguration.Builder |
FineTuneConfiguration.Builder.l2(double l2) |
|
FineTuneConfiguration.Builder |
FineTuneConfiguration.Builder.l2Bias(double l2Bias) |
|
FineTuneConfiguration.Builder |
FineTuneConfiguration.Builder.maxNumLineSearchIterations(int maxNumLineSearchIterations) |
|
FineTuneConfiguration.Builder |
FineTuneConfiguration.Builder.miniBatch(boolean miniBatch) |
Whether scores and gradients should be divided by the minibatch size.
Most users should leave this at the default value of true.
|
FineTuneConfiguration.Builder |
FineTuneConfiguration.Builder.minimize(boolean minimize) |
|
FineTuneConfiguration.Builder |
FineTuneConfiguration.Builder.optimizationAlgo(OptimizationAlgorithm optimizationAlgo) |
|
FineTuneConfiguration.Builder |
FineTuneConfiguration.Builder.pretrain(boolean pretrain) |
|
FineTuneConfiguration.Builder |
FineTuneConfiguration.Builder.seed(int seed) |
RNG seed for reproducibility
|
FineTuneConfiguration.Builder |
FineTuneConfiguration.Builder.seed(long seed) |
RNG seed for reproducibility
|
FineTuneConfiguration.Builder |
FineTuneConfiguration.Builder.stepFunction(StepFunction stepFunction) |
|
FineTuneConfiguration.Builder |
FineTuneConfiguration.Builder.tbpttBackLength(int tbpttBackLength) |
|
FineTuneConfiguration.Builder |
FineTuneConfiguration.Builder.tbpttFwdLength(int tbpttFwdLength) |
When doing truncated BPTT: how many steps of the forward pass should be done
before performing (truncated) backprop?
Only applicable when using backpropType(BackpropType.TruncatedBPTT).
Typically the tBPTTForwardLength parameter is the same as the tBPTTBackwardLength parameter,
but it may be larger in some circumstances (never smaller).
Ideally the length of your training time series should be divisible by this value.
This is the k1 parameter on page 23 of
http://www.cs.utoronto.ca/~ilya/pubs/ilya_sutskever_phd_thesis.pdf
|
FineTuneConfiguration.Builder |
FineTuneConfiguration.Builder.trainingWorkspaceMode(WorkspaceMode trainingWorkspaceMode) |
This method defines the workspace mode used during training:
NONE: workspaces won't be used
ENABLED: workspaces will be used for training (reduced memory and better performance)
|
FineTuneConfiguration.Builder |
FineTuneConfiguration.Builder.updater(Updater updater) |
Deprecated.
|
FineTuneConfiguration.Builder |
FineTuneConfiguration.Builder.updater(IUpdater updater) |
Gradient updater configuration.
|
FineTuneConfiguration.Builder |
FineTuneConfiguration.Builder.weightDecay(double coefficient) |
Add weight decay regularization for the network parameters (excluding biases).
This applies weight decay multiplied by the learning rate; see WeightDecay for more details.
|
FineTuneConfiguration.Builder |
FineTuneConfiguration.Builder.weightDecay(double coefficient,
boolean applyLR) |
Add weight decay regularization for the network parameters (excluding biases).
|
FineTuneConfiguration.Builder |
FineTuneConfiguration.Builder.weightDecayBias(double coefficient) |
|
FineTuneConfiguration.Builder |
FineTuneConfiguration.Builder.weightDecayBias(double coefficient,
boolean applyLR) |
|
FineTuneConfiguration.Builder |
FineTuneConfiguration.Builder.weightInit(Distribution distribution) |
Set weight initialization scheme to random sampling via the specified distribution.
|
FineTuneConfiguration.Builder |
FineTuneConfiguration.Builder.weightInit(IWeightInit weightInit) |
Weight initialization scheme to use for initial weight values
|
FineTuneConfiguration.Builder |
FineTuneConfiguration.Builder.weightInit(WeightInit weightInit) |
Weight initialization scheme to use for initial weight values
|
FineTuneConfiguration.Builder |
FineTuneConfiguration.Builder.weightNoise(IWeightNoise weightNoise) |
|
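For context, the builder methods listed above are typically chained on a single FineTuneConfiguration.Builder, and the resulting configuration is handed to DL4J's transfer-learning API. The following is a minimal sketch of that pattern; the class name FineTuneExample, the concrete hyperparameter values, the frozen-layer index, and the caller-supplied pretrained MultiLayerNetwork are illustrative assumptions, and build(), TransferLearning.Builder, fineTuneConfiguration(...), and setFeatureExtractor(...) belong to the wider transfer-learning API rather than to the table above.

import org.deeplearning4j.nn.conf.WorkspaceMode;
import org.deeplearning4j.nn.multilayer.MultiLayerNetwork;
import org.deeplearning4j.nn.transferlearning.FineTuneConfiguration;
import org.deeplearning4j.nn.transferlearning.TransferLearning;
import org.deeplearning4j.nn.weights.WeightInit;
import org.nd4j.linalg.activations.Activation;
import org.nd4j.linalg.learning.config.Adam;

public class FineTuneExample {

    /**
     * Returns a copy of the given pretrained network, reconfigured for fine-tuning.
     * The hyperparameter values and the frozen-layer index are placeholders for this sketch.
     */
    public static MultiLayerNetwork fineTune(MultiLayerNetwork pretrained) {
        // Global overrides applied on top of the pretrained model's configuration
        FineTuneConfiguration fineTuneConf = new FineTuneConfiguration.Builder()
                .activation(Activation.RELU)                    // neuron non-linearity
                .weightInit(WeightInit.XAVIER)                  // init scheme for re-initialized weights
                .updater(new Adam(1e-4))                        // gradient updater configuration
                .dropOut(0.8)                                   // input retain probability
                .weightDecay(1e-5)                              // weight decay on weights (excluding biases)
                .seed(12345L)                                   // RNG seed for reproducibility
                .trainingWorkspaceMode(WorkspaceMode.ENABLED)   // workspaces during training
                .inferenceWorkspaceMode(WorkspaceMode.ENABLED)  // workspaces during inference
                .build();

        // Freeze parameters up to and including layer 2 (placeholder index);
        // only the later layers are updated during fine-tuning.
        return new TransferLearning.Builder(pretrained)
                .fineTuneConfiguration(fineTuneConf)
                .setFeatureExtractor(2)
                .build();
    }
}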