public abstract class BasePretrainNetwork<LayerConfT extends BasePretrainNetwork> extends BaseLayer<LayerConfT>
Nested classes/interfaces inherited from interface Layer: Layer.TrainingMode, Layer.Type
Modifier and Type | Field and Description
---|---
protected Collection&lt;TrainingListener&gt; | trainingListeners
Fields inherited from class BaseLayer: gradient, gradientsFlattened, gradientViews, optimizer, params, paramsFlattened, score, solver

Fields inherited from class AbstractLayer: cacheMode, conf, dropoutApplied, dropoutMask, index, input, iterationListeners, maskArray, maskState, preOutput
Constructor and Description
---
BasePretrainNetwork(NeuralNetConfiguration conf)
BasePretrainNetwork(NeuralNetConfiguration conf, org.nd4j.linalg.api.ndarray.INDArray input)
Modifier and Type | Method and Description
---|---
org.nd4j.linalg.primitives.Pair&lt;Gradient,org.nd4j.linalg.api.ndarray.INDArray&gt; | backpropGradient(org.nd4j.linalg.api.ndarray.INDArray epsilon): Calculate the gradient relative to the error in the next layer.
double | calcL1(boolean backpropParamsOnly): Calculate the L1 regularization term; 0.0 if regularization is not used.
double | calcL2(boolean backpropParamsOnly): Calculate the L2 regularization term; 0.0 if regularization is not used.
protected Gradient | createGradient(org.nd4j.linalg.api.ndarray.INDArray wGradient, org.nd4j.linalg.api.ndarray.INDArray vBiasGradient, org.nd4j.linalg.api.ndarray.INDArray hBiasGradient)
org.nd4j.linalg.api.ndarray.INDArray | getCorruptedInput(org.nd4j.linalg.api.ndarray.INDArray x, double corruptionLevel): Corrupts the given input by binomial sampling at the given corruption level.
int | numParams(): The number of parameters for the model, for backprop (i.e., excluding the visible bias).
int | numParams(boolean backwards): The number of parameters for the model.
org.nd4j.linalg.api.ndarray.INDArray | params(): Returns the parameters of the neural network as a flattened row vector.
Map&lt;String,org.nd4j.linalg.api.ndarray.INDArray&gt; | paramTable(boolean backpropParamsOnly): Table of parameters by key, for backprop; for many models (dense layers, etc.) all parameters are backprop parameters.
abstract org.nd4j.linalg.primitives.Pair&lt;org.nd4j.linalg.api.ndarray.INDArray,org.nd4j.linalg.api.ndarray.INDArray&gt; | sampleHiddenGivenVisible(org.nd4j.linalg.api.ndarray.INDArray v): Sample the hidden distribution given the visible units.
abstract org.nd4j.linalg.primitives.Pair&lt;org.nd4j.linalg.api.ndarray.INDArray,org.nd4j.linalg.api.ndarray.INDArray&gt; | sampleVisibleGivenHidden(org.nd4j.linalg.api.ndarray.INDArray h): Sample the visible distribution given the hidden units.
void | setListeners(Collection&lt;IterationListener&gt; listeners): Set the iteration listeners for this layer.
void | setListeners(IterationListener... listeners): Set the iteration listeners for this layer.
void | setParams(org.nd4j.linalg.api.ndarray.INDArray params): Set the parameters for this model.
protected void | setScoreWithZ(org.nd4j.linalg.api.ndarray.INDArray z)
Methods inherited from class BaseLayer: accumulateScore, activate, activate, activate, activationMean, applyLearningRateScoreDecay, calcGradient, clone, computeGradientAndScore, error, fit, fit, getGradientsViewArray, getOptimizer, getParam, gradient, initParams, iterate, layerConf, merge, paramTable, preOutput, preOutput, score, setBackpropGradientsViewArray, setParam, setParams, setParamsViewArray, setParamTable, toString, transpose, update, update

Methods inherited from class AbstractLayer: activate, activate, activate, addListeners, applyDropOutIfNecessary, applyMask, batchSize, clear, conf, derivativeActivation, feedForwardMaskArray, getIndex, getInput, getInputMiniBatchSize, getListeners, getMaskArray, gradientAndScore, init, input, layerId, preOutput, preOutput, setCacheMode, setConf, setIndex, setInput, setInputMiniBatchSize, setMaskArray, type, validateInput

Methods inherited from class java.lang.Object: equals, finalize, getClass, hashCode, notify, notifyAll, wait, wait, wait

Methods inherited from interface Layer: isPretrainLayer
protected Collection<TrainingListener> trainingListeners
public BasePretrainNetwork(NeuralNetConfiguration conf)
public BasePretrainNetwork(NeuralNetConfiguration conf, org.nd4j.linalg.api.ndarray.INDArray input)
public void setListeners(Collection&lt;IterationListener&gt; listeners)
Description copied from interface: Layer
Set the iteration listeners for this layer.
Specified by: setListeners in interface Layer
Specified by: setListeners in interface Model
Overrides: setListeners in class AbstractLayer&lt;LayerConfT extends BasePretrainNetwork&gt;
public void setListeners(IterationListener... listeners)
Description copied from interface: Layer
Set the iteration listeners for this layer.
Specified by: setListeners in interface Layer
Specified by: setListeners in interface Model
Overrides: setListeners in class AbstractLayer&lt;LayerConfT extends BasePretrainNetwork&gt;
public org.nd4j.linalg.api.ndarray.INDArray getCorruptedInput(org.nd4j.linalg.api.ndarray.INDArray x, double corruptionLevel)
Corrupts the given input by doing a binomial sampling given the corruption level.
Parameters:
x - the input to corrupt
corruptionLevel - the corruption value

protected Gradient createGradient(org.nd4j.linalg.api.ndarray.INDArray wGradient, org.nd4j.linalg.api.ndarray.INDArray vBiasGradient, org.nd4j.linalg.api.ndarray.INDArray hBiasGradient)
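The binomial corruption behind getCorruptedInput can be sketched in plain Java. This is an illustration of masking noise, assuming each element is kept with probability (1 - corruptionLevel); the class name, array types, and helper are made up for the sketch and are not ND4J's implementation.

```java
import java.util.Random;

// Illustration of corruption via binomial (Bernoulli) sampling: each
// element is kept with probability (1 - corruptionLevel) and zeroed
// otherwise. A sketch of the idea, not ND4J's actual implementation.
public class CorruptionSketch {
    public static double[] corrupt(double[] x, double corruptionLevel, Random rng) {
        double[] out = new double[x.length];
        for (int i = 0; i < x.length; i++) {
            // Bernoulli trial: mask is 1 with probability (1 - corruptionLevel)
            double mask = rng.nextDouble() < (1.0 - corruptionLevel) ? 1.0 : 0.0;
            out[i] = x[i] * mask;
        }
        return out;
    }
}
```

With corruptionLevel 0.0 the input is returned unchanged; with 1.0 every element is zeroed.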
public int numParams(boolean backwards)
Description copied from interface: Model
The number of parameters for the model.
Specified by: numParams in interface Model
Overrides: numParams in class AbstractLayer&lt;LayerConfT extends BasePretrainNetwork&gt;
public abstract org.nd4j.linalg.primitives.Pair&lt;org.nd4j.linalg.api.ndarray.INDArray,org.nd4j.linalg.api.ndarray.INDArray&gt; sampleHiddenGivenVisible(org.nd4j.linalg.api.ndarray.INDArray v)
Sample the hidden distribution given the visible.
Parameters:
v - the visible to sample from

public abstract org.nd4j.linalg.primitives.Pair&lt;org.nd4j.linalg.api.ndarray.INDArray,org.nd4j.linalg.api.ndarray.INDArray&gt; sampleVisibleGivenHidden(org.nd4j.linalg.api.ndarray.INDArray h)
Sample the visible distribution given the hidden.
Parameters:
h - the hidden to sample from

protected void setScoreWithZ(org.nd4j.linalg.api.ndarray.INDArray z)
Overrides: setScoreWithZ in class BaseLayer&lt;LayerConfT extends BasePretrainNetwork&gt;
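For intuition, the Pair returned by a sampler like sampleHiddenGivenVisible holds the activation probabilities and a Bernoulli sample drawn from them. The sketch below assumes an RBM-style sigmoid activation with plain arrays for weights and biases; these are illustrative assumptions, not DL4J's internals.

```java
import java.util.Random;

// Sketch of RBM-style hidden sampling: compute p(h_j = 1 | v) with a
// sigmoid over the pre-activation, then draw a Bernoulli sample.
// Weight/bias layout here is illustrative, not DL4J's API.
public class SamplingSketch {
    static double sigmoid(double z) { return 1.0 / (1.0 + Math.exp(-z)); }

    // Returns {probabilities, samples} for one visible vector v.
    public static double[][] sampleHiddenGivenVisible(
            double[] v, double[][] w, double[] hBias, Random rng) {
        int nHidden = hBias.length;
        double[] prob = new double[nHidden];
        double[] sample = new double[nHidden];
        for (int j = 0; j < nHidden; j++) {
            double pre = hBias[j];
            for (int i = 0; i < v.length; i++) pre += v[i] * w[i][j];
            prob[j] = sigmoid(pre);
            sample[j] = rng.nextDouble() < prob[j] ? 1.0 : 0.0; // Bernoulli draw
        }
        return new double[][] {prob, sample};
    }
}
```

sampleVisibleGivenHidden is the mirror image: the same computation with the roles of v and h (and the transpose of w) swapped.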
public Map&lt;String,org.nd4j.linalg.api.ndarray.INDArray&gt; paramTable(boolean backpropParamsOnly)
Description copied from interface: Model
Table of parameters by key, for backprop. For many models (dense layers, etc.) all parameters are backprop parameters.
Specified by: paramTable in interface Model
Overrides: paramTable in class BaseLayer&lt;LayerConfT extends BasePretrainNetwork&gt;
Parameters:
backpropParamsOnly - If true, return backprop params only. If false, return all params (equivalent to paramsTable()).

public org.nd4j.linalg.api.ndarray.INDArray params()
Description copied from class: BaseLayer
Returns the parameters of the neural network as a flattened row vector.
Specified by: params in interface Model
Overrides: params in class BaseLayer&lt;LayerConfT extends BasePretrainNetwork&gt;
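How a parameter table relates to the flattened vector from params(), and to the backprop-only count from numParams(), can be sketched as follows. The key names ("W", "hb", "vb") and double[] storage are assumptions for illustration, not DL4J's internal layout.

```java
import java.util.Map;

// Sketch relating a layer's parameter table to the flattened row
// vector that params() returns: values concatenated in table order.
public class ParamsSketch {
    public static double[] flatten(Map<String, double[]> table) {
        int n = 0;
        for (double[] p : table.values()) n += p.length;
        double[] flat = new double[n];
        int offset = 0;
        for (double[] p : table.values()) {
            System.arraycopy(p, 0, flat, offset, p.length);
            offset += p.length;
        }
        return flat;
    }

    // Mirrors the numParams() idea: count backprop params only, i.e.
    // skip the visible bias ("vb") used solely during pretraining.
    public static int numBackpropParams(Map<String, double[]> table) {
        int n = 0;
        for (Map.Entry<String, double[]> e : table.entrySet())
            if (!e.getKey().equals("vb")) n += e.getValue().length;
        return n;
    }
}
```

A LinkedHashMap keeps insertion order, so the flattened vector's layout is deterministic.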
public int numParams()
The number of parameters for the model, for backprop (i.e., excluding the visible bias).
Specified by: numParams in interface Model
Overrides: numParams in class BaseLayer&lt;LayerConfT extends BasePretrainNetwork&gt;
public void setParams(org.nd4j.linalg.api.ndarray.INDArray params)
Description copied from interface: Model
Set the parameters for this model.
Specified by: setParams in interface Model
Overrides: setParams in class BaseLayer&lt;LayerConfT extends BasePretrainNetwork&gt;
Parameters:
params - the parameters for the model

public org.nd4j.linalg.primitives.Pair&lt;Gradient,org.nd4j.linalg.api.ndarray.INDArray&gt; backpropGradient(org.nd4j.linalg.api.ndarray.INDArray epsilon)
Description copied from interface: Layer
Calculate the gradient relative to the error in the next layer.
Specified by: backpropGradient in interface Layer
Overrides: backpropGradient in class BaseLayer&lt;LayerConfT extends BasePretrainNetwork&gt;
Parameters:
epsilon - w^(L+1)*delta^(L+1). Or, equivalently, dC/da, i.e., (dC/dz)*(dz/da) = dC/da, where C is the cost function and a = sigma(z) is the activation.

public double calcL2(boolean backpropParamsOnly)
Description copied from interface: Layer
Calculate the L2 regularization term; 0.0 if regularization is not used.
Specified by: calcL2 in interface Layer
Overrides: calcL2 in class BaseLayer&lt;LayerConfT extends BasePretrainNetwork&gt;
Parameters:
backpropParamsOnly - If true, calculate L2 based on backprop params only. If false, calculate based on all params (including pretrain params, if any).

public double calcL1(boolean backpropParamsOnly)
Description copied from interface: Layer
Calculate the L1 regularization term; 0.0 if regularization is not used.
Specified by: calcL1 in interface Layer
Overrides: calcL1 in class BaseLayer&lt;LayerConfT extends BasePretrainNetwork&gt;
Parameters:
backpropParamsOnly - If true, calculate L1 based on backprop params only. If false, calculate based on all params (including pretrain params, if any).

Copyright © 2017. All rights reserved.
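As a closing illustration, the regularization terms that calcL1 and calcL2 describe can be sketched over a flat parameter vector. The coefficient handling below is a generic regularization sketch under assumed lambda coefficients, not DL4J's exact internal computation.

```java
// Generic sketch of L1/L2 regularization terms over a flattened
// parameter vector: l1 = lambda1 * sum(|w|), l2 = lambda2 * sum(w^2).
// Coefficients and exact scaling are illustrative assumptions.
public class RegularizationSketch {
    public static double calcL1(double[] params, double lambda1) {
        if (lambda1 == 0.0) return 0.0; // regularization not used
        double sum = 0.0;
        for (double w : params) sum += Math.abs(w);
        return lambda1 * sum;
    }

    public static double calcL2(double[] params, double lambda2) {
        if (lambda2 == 0.0) return 0.0; // regularization not used
        double sum = 0.0;
        for (double w : params) sum += w * w;
        return lambda2 * sum;
    }
}
```

With backpropParamsOnly, the vector passed in would simply exclude the pretrain-only parameters (such as the visible bias).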