public class FrozenLayer extends BaseWrapperLayer

Nested classes inherited from interface `Layer`: `Layer.TrainingMode`, `Layer.Type`

Fields inherited from class `BaseWrapperLayer`: `underlying`
| Constructor and Description |
|---|
| `FrozenLayer(Layer insideLayer)` |
| Modifier and Type | Method and Description |
|---|---|
| `INDArray` | `activate(boolean training, LayerWorkspaceMgr workspaceMgr)`<br>Perform forward pass and return the activations array with the last set input |
| `INDArray` | `activate(INDArray input, boolean training, LayerWorkspaceMgr workspaceMgr)`<br>Perform forward pass and return the activations array with the specified input |
| `void` | `applyConstraints(int iteration, int epoch)`<br>Apply any constraints to the model |
| `Pair<Gradient,INDArray>` | `backpropGradient(INDArray epsilon, LayerWorkspaceMgr workspaceMgr)`<br>Calculate the gradient relative to the error in the next layer |
| `double` | `calcRegularizationScore(boolean backpropParamsOnly)`<br>Calculate the regularization component of the score for the parameters in this layer: for example, the L1, L2 and/or weight decay components of the loss function |
| `void` | `computeGradientAndScore(LayerWorkspaceMgr workspaceMgr)`<br>Update the score |
| `void` | `fit()`<br>All models have a fit method |
| `void` | `fit(INDArray data, LayerWorkspaceMgr workspaceMgr)`<br>Fit the model to the given data |
| `TrainingConfig` | `getConfig()` |
| `Layer` | `getInsideLayer()` |
| `Gradient` | `gradient()`<br>Get the gradient |
| `Pair<Gradient,Double>` | `gradientAndScore()`<br>Get the gradient and score |
| `void` | `init()`<br>Initialize the model |
| `protected String` | `layerId()` |
| `void` | `logTestMode(boolean training)` |
| `void` | `logTestMode(Layer.TrainingMode training)` |
| `void` | `setBackpropGradientsViewArray(INDArray gradients)`<br>Set the gradients array as a view of the full (backprop) network parameters. NOTE: this is intended to be used internally in MultiLayerNetwork and ComputationGraph, not by users. |
| `void` | `setCacheMode(CacheMode mode)`<br>Sets the given CacheMode for the current layer |
| `void` | `update(Gradient gradient)`<br>Update layer weights and biases with gradient change |
| `void` | `update(INDArray gradient, String paramType)`<br>Perform one update, applying the gradient |
Methods inherited from class `BaseWrapperLayer`: addListeners, allowInputModification, batchSize, clear, clearNoiseWeightParams, conf, feedForwardMaskArray, getEpochCount, getGradientsViewArray, getHelper, getIndex, getInputMiniBatchSize, getIterationCount, getListeners, getMaskArray, getOptimizer, getParam, input, isPretrainLayer, numParams, numParams, params, paramTable, paramTable, score, setConf, setEpochCount, setIndex, setInput, setInputMiniBatchSize, setIterationCount, setListeners, setListeners, setMaskArray, setParam, setParams, setParamsViewArray, setParamTable, type, updaterDivideByMinibatch
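As the method summary shows, `FrozenLayer` wraps another layer, delegating the forward pass while turning training-related calls into no-ops. A minimal, self-contained sketch of that wrapper pattern (hypothetical interface and names for illustration only; the real DL4J `Layer`/`Model` API is far larger):

```java
// Minimal sketch of the frozen-wrapper pattern (hypothetical types, not the DL4J API).
interface SimpleLayer {
    double[] activate(double[] input); // forward pass
    void update(double delta);         // apply a parameter update
    double param();                    // read the (single) parameter
}

// A trivial "dense" layer with one scalar weight.
class DenseSketch implements SimpleLayer {
    private double weight;
    DenseSketch(double weight) { this.weight = weight; }
    public double[] activate(double[] input) {
        double[] out = new double[input.length];
        for (int i = 0; i < input.length; i++) out[i] = weight * input[i];
        return out;
    }
    public void update(double delta) { weight += delta; }
    public double param() { return weight; }
}

// Like FrozenLayer: the forward pass delegates to the wrapped layer,
// while parameter updates are silently ignored.
class FrozenSketch implements SimpleLayer {
    private final SimpleLayer underlying;
    FrozenSketch(SimpleLayer underlying) { this.underlying = underlying; }
    public double[] activate(double[] input) { return underlying.activate(input); }
    public void update(double delta) { /* frozen: no-op */ }
    public double param() { return underlying.param(); }
}

public class Main {
    public static void main(String[] args) {
        SimpleLayer frozen = new FrozenSketch(new DenseSketch(2.0));
        frozen.update(0.5);  // ignored: the layer is frozen
        System.out.println(frozen.param());                        // 2.0
        System.out.println(frozen.activate(new double[]{3.0})[0]); // 6.0
    }
}
```

The same decorator structure explains why the detail section below is dominated by one-line overrides: each training method exists only to neutralize or forward the corresponding call on the wrapped layer.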
public FrozenLayer(Layer insideLayer)
`public void setCacheMode(CacheMode mode)`

Description copied from interface: `Layer`

- Specified by: `setCacheMode` in interface `Layer`
- Overrides: `setCacheMode` in class `BaseWrapperLayer`

`protected String layerId()`

`public double calcRegularizationScore(boolean backpropParamsOnly)`

Description copied from interface: `Layer`

- Specified by: `calcRegularizationScore` in interface `Layer`
- Overrides: `calcRegularizationScore` in class `BaseWrapperLayer`
- Parameters: `backpropParamsOnly` - If true: calculate the regularization score based on backprop params only. If false: calculate based on all params (including pretrain params, if any)

`public Pair<Gradient,INDArray> backpropGradient(INDArray epsilon, LayerWorkspaceMgr workspaceMgr)`

Description copied from interface: `Layer`

- Specified by: `backpropGradient` in interface `Layer`
- Overrides: `backpropGradient` in class `BaseWrapperLayer`
- Parameters: `epsilon` - `w^(L+1)*delta^(L+1)`. Or, equivalently: `dC/da`, i.e., `(dC/dz)*(dz/da) = dC/da`, where C is the cost function and `a = sigma(z)` is the activation. `workspaceMgr` - Workspace manager
- Returns: the gradient and epsilon pair; the epsilon array is allocated in the `ArrayType.ACTIVATION_GRAD` workspace via the workspace manager

`public INDArray activate(boolean training, LayerWorkspaceMgr workspaceMgr)`

Description copied from interface: `Layer`

- Specified by: `activate` in interface `Layer`
- Overrides: `activate` in class `BaseWrapperLayer`
- Parameters: `training` - training or test mode. `workspaceMgr` - Workspace manager
- Returns: the activations array, allocated in the `ArrayType.ACTIVATIONS` workspace via the workspace manager

`public INDArray activate(INDArray input, boolean training, LayerWorkspaceMgr workspaceMgr)`

Description copied from interface: `Layer`

- Specified by: `activate` in interface `Layer`
- Overrides: `activate` in class `BaseWrapperLayer`
- Parameters: `input` - the input to use. `training` - train or test mode. `workspaceMgr` - Workspace manager
- Returns: the activations array, allocated in the `ArrayType.ACTIVATIONS` workspace via the workspace manager

`public void fit()`

Description copied from interface: `Model`

- Specified by: `fit` in interface `Model`
- Overrides: `fit` in class `BaseWrapperLayer`

`public void update(Gradient gradient)`

Description copied from interface: `Model`

- Specified by: `update` in interface `Model`
- Overrides: `update` in class `BaseWrapperLayer`

`public void update(INDArray gradient, String paramType)`

Description copied from interface: `Model`

- Specified by: `update` in interface `Model`
- Overrides: `update` in class `BaseWrapperLayer`
- Parameters: `gradient` - the gradient to apply

`public void computeGradientAndScore(LayerWorkspaceMgr workspaceMgr)`

Description copied from interface: `Model`

- Specified by: `computeGradientAndScore` in interface `Model`
- Overrides: `computeGradientAndScore` in class `BaseWrapperLayer`

`public void setBackpropGradientsViewArray(INDArray gradients)`

Description copied from interface: `Model`

- Specified by: `setBackpropGradientsViewArray` in interface `Model`
- Overrides: `setBackpropGradientsViewArray` in class `BaseWrapperLayer`
- Parameters: `gradients` - a 1 x nParams row vector that is a view of the larger (MLN/CG) gradients array

`public void fit(INDArray data, LayerWorkspaceMgr workspaceMgr)`

Description copied from interface: `Model`

- Specified by: `fit` in interface `Model`
- Overrides: `fit` in class `BaseWrapperLayer`
- Parameters: `data` - the data to fit the model to

`public Gradient gradient()`

Description copied from interface: `Model`

- Specified by: `gradient` in interface `Model`
- Overrides: `gradient` in class `BaseWrapperLayer`
- See Also: `Model.computeGradientAndScore(LayerWorkspaceMgr)`

`public Pair<Gradient,Double> gradientAndScore()`

Description copied from interface: `Model`

- Specified by: `gradientAndScore` in interface `Model`
- Overrides: `gradientAndScore` in class `BaseWrapperLayer`

`public void applyConstraints(int iteration, int epoch)`

Description copied from interface: `Model`

- Specified by: `applyConstraints` in interface `Model`
- Overrides: `applyConstraints` in class `BaseWrapperLayer`

`public void init()`

- Specified by: `init` in interface `Model`
- Overrides: `init` in class `BaseWrapperLayer`

`public void logTestMode(boolean training)`

`public void logTestMode(Layer.TrainingMode training)`

`public Layer getInsideLayer()`

`public TrainingConfig getConfig()`

- Specified by: `getConfig` in interface `Trainable`
- Overrides: `getConfig` in class `BaseWrapperLayer`
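The `backpropGradient` entry above returns an epsilon (activation gradient) array even though the layer is frozen: earlier, trainable layers still need the error signal that passes through a frozen layer. A hedged, hypothetical sketch of why, using a scalar layer `y = w * x` instead of the real INDArray-based API (names and behavior are illustrative, not the actual DL4J implementation):

```java
// Hypothetical illustration (not the DL4J implementation): backprop through a
// frozen scalar layer y = w * x. The upstream error signal dL/dx must still
// flow through the layer, but the parameter gradient dL/dw is dropped, so the
// frozen weight never changes.
class FrozenBackpropSketch {
    // Returns {dLdx, dLdw} given weight w, input x, and incoming gradient dLdy.
    static double[] frozenBackprop(double w, double x, double dLdy) {
        double dLdx = w * dLdy; // propagated to earlier layers, like the returned epsilon
        double dLdw = 0.0;      // frozen: parameter gradient zeroed (would be x * dLdy otherwise)
        return new double[]{dLdx, dLdw};
    }

    public static void main(String[] args) {
        double[] g = frozenBackprop(2.0, 3.0, 1.5);
        System.out.println(g[0] + " " + g[1]); // 3.0 0.0
    }
}
```

This is the decorator analogue of the `update` overrides above: gradients that would modify the wrapped layer are discarded, while everything other layers depend on is passed through unchanged.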
Copyright © 2019. All rights reserved.