public class Dropout extends Object implements IDropout
Modifier and Type | Field and Description |
---|---|
static String | CUDNN_DROPOUT_HELPER_CLASS_NAME |
protected boolean | helperAllowFallback - When using CuDNN and an error is encountered, should fallback to the non-CuDNN implementation be allowed? If set to false, an exception in CuDNN will be propagated back to the user. |
Modifier | Constructor and Description |
---|---|
 | Dropout(double activationRetainProbability) |
protected | Dropout(double activationRetainProbability, ISchedule activationRetainProbabilitySchedule) |
 | Dropout(ISchedule activationRetainProbabilitySchedule) |
Modifier and Type | Method and Description |
---|---|
INDArray | applyDropout(INDArray inputActivations, INDArray output, int iteration, int epoch, LayerWorkspaceMgr workspaceMgr) |
INDArray | backprop(INDArray gradAtOutput, INDArray gradAtInput, int iteration, int epoch) - Perform backprop. |
void | clear() - Clear the internal state (for example, the dropout mask), if any is present. |
Dropout | clone() |
Dropout | helperAllowFallback(boolean allowFallback) - When using a helper (CuDNN or MKLDNN in some cases) and an error is encountered, should fallback to the non-helper implementation be allowed? If set to false, an exception in the helper will be propagated back to the user. |
protected void | initializeHelper(DataType dataType) - Initialize the CuDNN dropout helper, if possible. |
protected boolean helperAllowFallback
public static final String CUDNN_DROPOUT_HELPER_CLASS_NAME
public Dropout(double activationRetainProbability)
activationRetainProbability - Probability of retaining an activation - see the Dropout javadoc

public Dropout(ISchedule activationRetainProbabilitySchedule)
activationRetainProbabilitySchedule - Schedule for the probability of retaining an activation - see the Dropout javadoc

protected Dropout(double activationRetainProbability, ISchedule activationRetainProbabilitySchedule)
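A minimal sketch of typical construction, assuming the standard DL4J/ND4J classes `org.nd4j.linalg.schedule.StepSchedule` and the `dropOut(IDropout)` builder method on layer configurations (neither is defined on this page):

```java
import org.deeplearning4j.nn.conf.dropout.Dropout;
import org.deeplearning4j.nn.conf.layers.DenseLayer;
import org.nd4j.linalg.schedule.ISchedule;
import org.nd4j.linalg.schedule.ScheduleType;
import org.nd4j.linalg.schedule.StepSchedule;

public class DropoutConfigSketch {
    public static void main(String[] args) {
        // Fixed retain probability: each activation is kept with p = 0.8
        Dropout fixed = new Dropout(0.8);

        // Scheduled retain probability: StepSchedule is one ISchedule implementation,
        // here starting at 0.95 and multiplied by 0.9 every 10 epochs
        ISchedule schedule = new StepSchedule(ScheduleType.EPOCH, 0.95, 0.9, 10);
        Dropout scheduled = new Dropout(schedule);

        // Typical use: pass the IDropout instance to a layer configuration builder
        DenseLayer layer = new DenseLayer.Builder()
                .nIn(128)
                .nOut(64)
                .dropOut(scheduled)
                .build();
    }
}
```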
public Dropout helperAllowFallback(boolean allowFallback)
allowFallback - Whether fallback to the non-helper implementation should be used

protected void initializeHelper(DataType dataType)
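Because helperAllowFallback(boolean) returns the Dropout instance, it can be chained onto construction; a minimal sketch:

```java
import org.deeplearning4j.nn.conf.dropout.Dropout;

// Propagate helper (CuDNN/MKLDNN) errors to the user instead of silently
// falling back to the built-in implementation
Dropout strict = new Dropout(0.5).helperAllowFallback(false);
```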
public INDArray applyDropout(INDArray inputActivations, INDArray output, int iteration, int epoch, LayerWorkspaceMgr workspaceMgr)
Specified by: applyDropout in interface IDropout
inputActivations - Input activations array
output - The result array (same as inputActivations for in-place ops) for the post-dropout activations
iteration - Current iteration number
epoch - Current epoch number
workspaceMgr - Workspace manager, if any storage is required (use ArrayType.INPUT)
public INDArray backprop(INDArray gradAtOutput, INDArray gradAtInput, int iteration, int epoch)
Perform backprop.
Specified by: backprop in interface IDropout
gradAtOutput - Gradients at the output of the dropout op - i.e., dL/dOut
gradAtInput - Gradients at the input of the dropout op - i.e., dL/dIn. Use the same array as gradAtOutput to apply the backprop gradient in-place
iteration - Current iteration
epoch - Current epoch
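A sketch of driving the forward and backward methods directly (for example, in a unit test); in normal training these are invoked by the layers. It assumes Nd4j.rand(DataType, long...) and LayerWorkspaceMgr.noWorkspaces() from standard ND4J/DL4J, which are not documented on this page:

```java
import org.deeplearning4j.nn.conf.dropout.Dropout;
import org.deeplearning4j.nn.workspace.LayerWorkspaceMgr;
import org.nd4j.linalg.api.buffer.DataType;
import org.nd4j.linalg.api.ndarray.INDArray;
import org.nd4j.linalg.factory.Nd4j;

public class DropoutForwardBackwardSketch {
    public static void main(String[] args) {
        Dropout dropout = new Dropout(0.5);

        INDArray input = Nd4j.rand(DataType.FLOAT, 32, 10); // minibatch of activations
        INDArray output = input.dup();                       // holder for post-dropout activations
        LayerWorkspaceMgr mgr = LayerWorkspaceMgr.noWorkspaces();

        // Forward pass: apply the dropout mask for iteration 0, epoch 0
        dropout.applyDropout(input, output, 0, 0, mgr);

        // Backward pass: passing the same array for gradAtOutput and gradAtInput
        // applies the backprop gradient in-place
        INDArray gradAtOutput = Nd4j.rand(DataType.FLOAT, 32, 10);
        dropout.backprop(gradAtOutput, gradAtOutput, 0, 0);

        // Release the internal state (dropout mask)
        dropout.clear();
    }
}
```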
public void clear()
Clear the internal state (for example, the dropout mask), if any is present.
Specified by: clear in interface IDropout