public abstract class BaseLayer<LayerConfT extends BaseLayer> extends AbstractLayer<LayerConfT>
Nested classes/interfaces inherited from interface Layer: Layer.TrainingMode, Layer.Type

| Modifier and Type | Field and Description |
|---|---|
| protected Gradient | gradient |
| protected org.nd4j.linalg.api.ndarray.INDArray | gradientsFlattened |
| protected Map&lt;String,org.nd4j.linalg.api.ndarray.INDArray&gt; | gradientViews |
| protected ConvexOptimizer | optimizer |
| protected Map&lt;String,org.nd4j.linalg.api.ndarray.INDArray&gt; | params |
| protected org.nd4j.linalg.api.ndarray.INDArray | paramsFlattened |
| protected double | score |
| protected Solver | solver |
| protected Map&lt;String,org.nd4j.linalg.api.ndarray.INDArray&gt; | weightNoiseParams |
Fields inherited from class AbstractLayer: cacheMode, conf, dropoutApplied, dropoutMask, epochCount, index, input, iterationCount, maskArray, maskState, preOutput, trainingListeners

| Constructor and Description |
|---|
| BaseLayer(NeuralNetConfiguration conf) |
| BaseLayer(NeuralNetConfiguration conf, org.nd4j.linalg.api.ndarray.INDArray input) |
| Modifier and Type | Method and Description |
|---|---|
| void | accumulateScore(double accum): Sets a rolling tally for the score. |
| org.nd4j.linalg.api.ndarray.INDArray | activate(boolean training, LayerWorkspaceMgr workspaceMgr): Perform forward pass and return the activations array with the last set input. |
| org.nd4j.linalg.primitives.Pair&lt;Gradient,org.nd4j.linalg.api.ndarray.INDArray&gt; | backpropGradient(org.nd4j.linalg.api.ndarray.INDArray epsilon, LayerWorkspaceMgr workspaceMgr): Calculate the gradient relative to the error in the next layer. |
| double | calcL1(boolean backpropParamsOnly): Calculate the L1 regularization term; 0.0 if regularization is not used. |
| double | calcL2(boolean backpropParamsOnly): Calculate the L2 regularization term; 0.0 if regularization is not used. |
| void | clear(): Clear the input. |
| void | clearNoiseWeightParams() |
| Layer | clone(): Clone the layer. |
| void | computeGradientAndScore(LayerWorkspaceMgr workspaceMgr): Update the score. |
| void | fit(): All models have a fit method. |
| void | fit(org.nd4j.linalg.api.ndarray.INDArray input, LayerWorkspaceMgr workspaceMgr): Fit the model to the given data. |
| org.nd4j.linalg.api.ndarray.INDArray | getGradientsViewArray() |
| ConvexOptimizer | getOptimizer(): Returns this model's optimizer. |
| org.nd4j.linalg.api.ndarray.INDArray | getParam(String param): Get the parameter. |
| protected org.nd4j.linalg.api.ndarray.INDArray | getParamWithNoise(String param, boolean training, LayerWorkspaceMgr workspaceMgr): Get the parameter, after applying any weight noise (such as DropConnect) if necessary. |
| Gradient | gradient(): Get the gradient. |
| boolean | hasBias(): Whether this layer has a bias term. Many layers (dense, convolutional, output, embedding) have biases by default, but no-bias versions are possible via configuration. |
| void | initParams(): Initialize the parameters. |
| LayerConfT | layerConf() |
| int | numParams(): The number of parameters for the model. |
| org.nd4j.linalg.api.ndarray.INDArray | params(): Returns the parameters of the neural network as a flattened row vector. |
| Map&lt;String,org.nd4j.linalg.api.ndarray.INDArray&gt; | paramTable(): The param table. |
| Map&lt;String,org.nd4j.linalg.api.ndarray.INDArray&gt; | paramTable(boolean backpropParamsOnly): Table of parameters by key, for backprop; for many models (dense layers, etc.) all parameters are backprop parameters. |
| protected org.nd4j.linalg.api.ndarray.INDArray | preOutput(boolean training, LayerWorkspaceMgr workspaceMgr) |
| double | score(): Objective function: the specified objective. |
| void | setBackpropGradientsViewArray(org.nd4j.linalg.api.ndarray.INDArray gradients): Set the gradients array as a view of the full (backprop) network parameters. NOTE: this is intended to be used internally in MultiLayerNetwork and ComputationGraph, not by users. |
| void | setParam(String key, org.nd4j.linalg.api.ndarray.INDArray val): Set the parameter with a new ndarray. |
| void | setParams(org.nd4j.linalg.api.ndarray.INDArray params): Set the parameters for this model. |
| protected void | setParams(org.nd4j.linalg.api.ndarray.INDArray params, char order) |
| void | setParamsViewArray(org.nd4j.linalg.api.ndarray.INDArray params): Set the initial parameters array as a view of the full (backprop) network parameters. NOTE: this is intended to be used internally in MultiLayerNetwork and ComputationGraph, not by users. |
| void | setParamTable(Map&lt;String,org.nd4j.linalg.api.ndarray.INDArray&gt; paramTable): Setter for the param table. |
| protected void | setScoreWithZ(org.nd4j.linalg.api.ndarray.INDArray z) |
| String | toString() |
| Layer | transpose(): Return a transposed copy of the weights/bias (this means reverse the number of inputs and outputs on the weights). |
| void | update(Gradient gradient): Update layer weights and biases with gradient change. |
| void | update(org.nd4j.linalg.api.ndarray.INDArray gradient, String paramType): Perform one update applying the gradient. |
Methods inherited from class AbstractLayer: activate, addListeners, applyConstraints, applyDropOutIfNecessary, applyMask, assertInputSet, batchSize, conf, feedForwardMaskArray, getIndex, getInput, getInputMiniBatchSize, getListeners, getMaskArray, gradientAndScore, init, input, layerId, numParams, setCacheMode, setConf, setIndex, setInput, setInputMiniBatchSize, setListeners, setMaskArray, type, validateInput

Methods inherited from class java.lang.Object: equals, finalize, getClass, hashCode, notify, notifyAll, wait

Methods inherited from interface Layer: getEpochCount, getIterationCount, isPretrainLayer, setEpochCount, setIterationCount

Field Detail

protected org.nd4j.linalg.api.ndarray.INDArray paramsFlattened
protected org.nd4j.linalg.api.ndarray.INDArray gradientsFlattened
protected double score
protected ConvexOptimizer optimizer
protected Gradient gradient
protected Solver solver
Constructor Detail

public BaseLayer(NeuralNetConfiguration conf)
public BaseLayer(NeuralNetConfiguration conf, org.nd4j.linalg.api.ndarray.INDArray input)
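These constructors are normally invoked by the framework rather than by user code: concrete BaseLayer implementations (dense, output, convolutional, and so on) are built from their layer configurations when the enclosing network is initialized. Below is a minimal sketch of that path, assuming the standard DL4J builder API; the layer sizes (4, 3, 2) are arbitrary example values.

```java
import org.deeplearning4j.nn.api.Layer;
import org.deeplearning4j.nn.conf.MultiLayerConfiguration;
import org.deeplearning4j.nn.conf.NeuralNetConfiguration;
import org.deeplearning4j.nn.conf.layers.DenseLayer;
import org.deeplearning4j.nn.conf.layers.OutputLayer;
import org.deeplearning4j.nn.multilayer.MultiLayerNetwork;
import org.nd4j.linalg.activations.Activation;
import org.nd4j.linalg.lossfunctions.LossFunctions;

public class BaseLayerConstructionSketch {
    public static void main(String[] args) {
        // Example configuration: 4 inputs -> 3 hidden units -> 2 outputs (illustrative sizes)
        MultiLayerConfiguration conf = new NeuralNetConfiguration.Builder()
                .list()
                .layer(0, new DenseLayer.Builder().nIn(4).nOut(3)
                        .activation(Activation.TANH).build())
                .layer(1, new OutputLayer.Builder(LossFunctions.LossFunction.MSE)
                        .activation(Activation.IDENTITY).nIn(3).nOut(2).build())
                .build();

        MultiLayerNetwork net = new MultiLayerNetwork(conf);
        net.init();   // layer instances are constructed here, from the configuration

        Layer dense = net.getLayer(0);   // a BaseLayer subclass created by init()
        System.out.println(dense.getClass().getName());
    }
}
```

net.init() is the point at which the configuration is turned into layer instances and their parameter views are allocated.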
Method Detail

public LayerConfT layerConf()
Overrides: layerConf in class AbstractLayer&lt;LayerConfT extends BaseLayer&gt;

public org.nd4j.linalg.primitives.Pair&lt;Gradient,org.nd4j.linalg.api.ndarray.INDArray&gt; backpropGradient(org.nd4j.linalg.api.ndarray.INDArray epsilon, LayerWorkspaceMgr workspaceMgr)
Description copied from interface: Layer. Calculate the gradient relative to the error in the next layer.
Parameters:
epsilon - w^(L+1)*delta^(L+1). Or, equivalently, dC/da, i.e. (dC/dz)*(dz/da) = dC/da, where C is the cost function and a = sigma(z) is the activation.
workspaceMgr - Workspace manager
Returns: Pair of (Gradient, epsilon); the returned activation-gradient array should be placed in the ArrayType.ACTIVATION_GRAD workspace via the workspace manager.
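The epsilon parameter above is the derivative of the cost with respect to this layer's activations, propagated back from the layer above. Restating the Javadoc formula in display form (the L/L+1 superscripts follow the Javadoc's indexing; the exact matrix orientation depends on the data layout and is not specified here):

```latex
\epsilon^{(L)} \;=\; \frac{\partial C}{\partial a^{(L)}}
\;=\; \frac{\partial C}{\partial z^{(L+1)}}\cdot\frac{\partial z^{(L+1)}}{\partial a^{(L)}}
\;=\; w^{(L+1)}\,\delta^{(L+1)},
\qquad a^{(L)} = \sigma\bigl(z^{(L)}\bigr).
```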
public void fit()
Description copied from interface: Model. All models have a fit method.
Specified by: fit in interface Model. Overrides: fit in class AbstractLayer&lt;LayerConfT extends BaseLayer&gt;

public void computeGradientAndScore(LayerWorkspaceMgr workspaceMgr)
Description copied from interface: Model. Update the score.
Specified by: computeGradientAndScore in interface Model. Overrides: computeGradientAndScore in class AbstractLayer&lt;LayerConfT extends BaseLayer&gt;

protected void setScoreWithZ(org.nd4j.linalg.api.ndarray.INDArray z)

public double score()
Objective function: the specified objective.
Specified by: score in interface Model. Overrides: score in class AbstractLayer&lt;LayerConfT extends BaseLayer&gt;

public Gradient gradient()
Description copied from interface: Model. Get the gradient. Note that this returns the gradient previously calculated by Model#computeGradientAndScore(); it does not recompute it.
Specified by: gradient in interface Model. Overrides: gradient in class AbstractLayer&lt;LayerConfT extends BaseLayer&gt;
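Since gradient() only returns what has already been computed, the typical sequence is: set the input and labels, call computeGradientAndScore(...), then read score() and gradient(). A minimal sketch at the network level, assuming a MultiLayerNetwork built and initialized as in the construction sketch above, the no-argument computeGradientAndScore() convenience overload, and example 4-feature/2-label shapes:

```java
import org.deeplearning4j.nn.gradient.Gradient;
import org.deeplearning4j.nn.multilayer.MultiLayerNetwork;
import org.nd4j.linalg.api.ndarray.INDArray;
import org.nd4j.linalg.factory.Nd4j;

public class GradientAndScoreSketch {
    // net is assumed to be an initialized MultiLayerNetwork (see the construction sketch above)
    static void printGradientAndScore(MultiLayerNetwork net) {
        INDArray features = Nd4j.rand(5, 4);   // 5 examples, 4 inputs (example shape)
        INDArray labels   = Nd4j.rand(5, 2);   // 5 examples, 2 outputs (example shape)

        net.setInput(features);
        net.setLabels(labels);
        net.computeGradientAndScore();          // computes both the gradient and the score

        double score = net.score();             // objective value produced by the call above
        Gradient g   = net.gradient();          // gradient produced by the call above
        System.out.println("score = " + score
                + ", gradient keys = " + g.gradientForVariable().keySet());
    }
}
```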
public void update(Gradient gradient)
Description copied from interface: Model. Update layer weights and biases with gradient change.
Specified by: update in interface Model. Overrides: update in class AbstractLayer&lt;LayerConfT extends BaseLayer&gt;

public void update(org.nd4j.linalg.api.ndarray.INDArray gradient, String paramType)
Description copied from interface: Model. Perform one update applying the gradient.
Specified by: update in interface Model. Overrides: update in class AbstractLayer&lt;LayerConfT extends BaseLayer&gt;
Parameters:
gradient - the gradient to apply

public ConvexOptimizer getOptimizer()
Description copied from interface: Model. Returns this model's optimizer.
Specified by: getOptimizer in interface Model. Overrides: getOptimizer in class AbstractLayer&lt;LayerConfT extends BaseLayer&gt;
public org.nd4j.linalg.api.ndarray.INDArray params()
Returns the parameters of the neural network as a flattened row vector.
Specified by: params in interface Model. Overrides: params in class AbstractLayer&lt;LayerConfT extends BaseLayer&gt;

public org.nd4j.linalg.api.ndarray.INDArray getParam(String param)
Description copied from interface: Model. Get the parameter.
Specified by: getParam in interface Model. Overrides: getParam in class AbstractLayer&lt;LayerConfT extends BaseLayer&gt;
Parameters:
param - the key of the parameter

public void setParam(String key, org.nd4j.linalg.api.ndarray.INDArray val)
Description copied from interface: Model. Set the parameter with a new ndarray.
Specified by: setParam in interface Model. Overrides: setParam in class AbstractLayer&lt;LayerConfT extends BaseLayer&gt;
Parameters:
key - the key to set
val - the new ndarray

public void setParams(org.nd4j.linalg.api.ndarray.INDArray params)
Description copied from interface: Model. Set the parameters for this model.
Specified by: setParams in interface Model. Overrides: setParams in class AbstractLayer&lt;LayerConfT extends BaseLayer&gt;
Parameters:
params - the parameters for the model

protected void setParams(org.nd4j.linalg.api.ndarray.INDArray params, char order)
Overrides: setParams in class AbstractLayer&lt;LayerConfT extends BaseLayer&gt;

public void setParamsViewArray(org.nd4j.linalg.api.ndarray.INDArray params)
Description copied from interface: Model. Set the initial parameters array as a view of the full (backprop) network parameters. NOTE: this is intended to be used internally in MultiLayerNetwork and ComputationGraph, not by users.
Specified by: setParamsViewArray in interface Model. Overrides: setParamsViewArray in class AbstractLayer&lt;LayerConfT extends BaseLayer&gt;
Parameters:
params - a 1 x nParams row vector that is a view of the larger (MLN/CG) parameters array

public org.nd4j.linalg.api.ndarray.INDArray getGradientsViewArray()
Specified by: getGradientsViewArray in interface Model. Overrides: getGradientsViewArray in class AbstractLayer&lt;LayerConfT extends BaseLayer&gt;

public void setBackpropGradientsViewArray(org.nd4j.linalg.api.ndarray.INDArray gradients)
Description copied from interface: Model. Set the gradients array as a view of the full (backprop) network parameters. NOTE: this is intended to be used internally in MultiLayerNetwork and ComputationGraph, not by users.
Specified by: setBackpropGradientsViewArray in interface Model. Overrides: setBackpropGradientsViewArray in class AbstractLayer&lt;LayerConfT extends BaseLayer&gt;
Parameters:
gradients - a 1 x nParams row vector that is a view of the larger (MLN/CG) gradients array

public void setParamTable(Map&lt;String,org.nd4j.linalg.api.ndarray.INDArray&gt; paramTable)
Description copied from interface: Model. Setter for the param table.
Specified by: setParamTable in interface Model. Overrides: setParamTable in class AbstractLayer&lt;LayerConfT extends BaseLayer&gt;

public void initParams()
Description copied from interface: Model. Initialize the parameters.
Specified by: initParams in interface Model. Overrides: initParams in class AbstractLayer&lt;LayerConfT extends BaseLayer&gt;

public Map&lt;String,org.nd4j.linalg.api.ndarray.INDArray&gt; paramTable()
Description copied from interface: Model. The param table.
Specified by: paramTable in interface Model. Overrides: paramTable in class AbstractLayer&lt;LayerConfT extends BaseLayer&gt;

public Map&lt;String,org.nd4j.linalg.api.ndarray.INDArray&gt; paramTable(boolean backpropParamsOnly)
Description copied from interface: Model. Table of parameters by key, for backprop. For many models (dense layers, etc.) all parameters are backprop parameters.
Specified by: paramTable in interface Model. Overrides: paramTable in class AbstractLayer&lt;LayerConfT extends BaseLayer&gt;
Parameters:
backpropParamsOnly - If true, return backprop params only. If false, return all params (equivalent to paramTable()).
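The parameter accessors above expose the same underlying storage in different shapes: params() as one flattened row vector, paramTable() keyed by parameter name (typically "W" and "b" for a dense layer), and getParam(...)/setParam(...) for a single entry. A minimal sketch, assuming a layer obtained from an initialized network (e.g. net.getLayer(0)) and the "W"/"b" key names used by standard feed-forward layers:

```java
import java.util.Arrays;
import java.util.Map;
import org.deeplearning4j.nn.api.Layer;
import org.nd4j.linalg.api.ndarray.INDArray;
import org.nd4j.linalg.factory.Nd4j;

public class ParamTableSketch {
    // layer is assumed to come from an initialized network, e.g. net.getLayer(0)
    static void inspectParams(Layer layer) {
        INDArray flat = layer.params();                    // all parameters as a flattened row vector
        Map<String, INDArray> table = layer.paramTable();  // parameters keyed by name, e.g. "W" and "b"
        INDArray weights = layer.getParam("W");            // a single parameter by key

        System.out.println("numParams = " + layer.numParams()
                + ", flat length = " + flat.length()
                + ", keys = " + table.keySet()
                + ", W shape = " + Arrays.toString(weights.shape()));

        // Replace the bias values with zeros of the same shape
        layer.setParam("b", Nd4j.zerosLike(layer.getParam("b")));
    }
}
```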
protected org.nd4j.linalg.api.ndarray.INDArray getParamWithNoise(String param, boolean training, LayerWorkspaceMgr workspaceMgr)
Get the parameter, after applying any weight noise (such as DropConnect) if necessary.
Parameters:
param - Parameter key
training - If true: during training
workspaceMgr - Workspace manager

protected org.nd4j.linalg.api.ndarray.INDArray preOutput(boolean training, LayerWorkspaceMgr workspaceMgr)

public org.nd4j.linalg.api.ndarray.INDArray activate(boolean training, LayerWorkspaceMgr workspaceMgr)
Description copied from interface: Layer. Perform forward pass and return the activations array with the last set input.
Parameters:
training - training or test mode
workspaceMgr - Workspace manager
Returns: the activations array; it should be placed in the ArrayType.ACTIVATIONS workspace via the workspace manager.
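A sketch of running the forward pass on a single layer outside of a network's fit/output loop. It assumes the activate(input, training, workspaceMgr) overload inherited from the Layer interface and LayerWorkspaceMgr.noWorkspaces(), i.e. no workspace scoping; the input shape is an arbitrary example:

```java
import org.deeplearning4j.nn.api.Layer;
import org.deeplearning4j.nn.workspace.LayerWorkspaceMgr;
import org.nd4j.linalg.api.ndarray.INDArray;
import org.nd4j.linalg.factory.Nd4j;

public class ActivateSketch {
    // layer is assumed to come from an initialized network, e.g. net.getLayer(0)
    static INDArray forward(Layer layer) {
        INDArray input = Nd4j.rand(5, 4);                         // 5 examples, 4 inputs (example shape)
        LayerWorkspaceMgr mgr = LayerWorkspaceMgr.noWorkspaces(); // no workspaces: plain detached arrays

        // training = false: inference mode (no dropout or weight noise)
        return layer.activate(input, false, mgr);
    }
}
```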
public double calcL2(boolean backpropParamsOnly)
Description copied from interface: Layer. Calculate the L2 regularization term; 0.0 if regularization is not used.
Specified by: calcL2 in interface Layer. Overrides: calcL2 in class AbstractLayer&lt;LayerConfT extends BaseLayer&gt;
Parameters:
backpropParamsOnly - If true: calculate L2 based on backprop params only. If false: calculate based on all params (including pretrain params, if any)

public double calcL1(boolean backpropParamsOnly)
Description copied from interface: Layer. Calculate the L1 regularization term; 0.0 if regularization is not used.
Specified by: calcL1 in interface Layer. Overrides: calcL1 in class AbstractLayer&lt;LayerConfT extends BaseLayer&gt;
Parameters:
backpropParamsOnly - If true: calculate L1 based on backprop params only. If false: calculate based on all params (including pretrain params, if any)
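As noted above, calcL1(...) and calcL2(...) return 0.0 unless L1/L2 regularization is configured. A minimal sketch, assuming the l1(...)/l2(...) builder options and illustrative coefficients:

```java
import org.deeplearning4j.nn.conf.MultiLayerConfiguration;
import org.deeplearning4j.nn.conf.NeuralNetConfiguration;
import org.deeplearning4j.nn.conf.layers.DenseLayer;
import org.deeplearning4j.nn.multilayer.MultiLayerNetwork;

public class RegularizationSketch {
    public static void main(String[] args) {
        MultiLayerConfiguration conf = new NeuralNetConfiguration.Builder()
                .l1(1e-5).l2(1e-4)                  // illustrative coefficients
                .list()
                .layer(0, new DenseLayer.Builder().nIn(4).nOut(3).build())
                .build();

        MultiLayerNetwork net = new MultiLayerNetwork(conf);
        net.init();

        // Regularization terms for the first layer, over all of its parameters
        double l1 = net.getLayer(0).calcL1(false);
        double l2 = net.getLayer(0).calcL2(false);
        System.out.println("L1 = " + l1 + ", L2 = " + l2);
    }
}
```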
public Layer clone()
Description copied from interface: Layer. Clone the layer.
Specified by: clone in interface Layer. Overrides: clone in class AbstractLayer&lt;LayerConfT extends BaseLayer&gt;
public int numParams()
The number of parameters for the model.
Specified by: numParams in interface Model. Overrides: numParams in class AbstractLayer&lt;LayerConfT extends BaseLayer&gt;

public void fit(org.nd4j.linalg.api.ndarray.INDArray input, LayerWorkspaceMgr workspaceMgr)
Description copied from interface: Model. Fit the model to the given data.
Specified by: fit in interface Model. Overrides: fit in class AbstractLayer&lt;LayerConfT extends BaseLayer&gt;
Parameters:
input - the data to fit the model to

public Layer transpose()
Description copied from interface: Layer. Return a transposed copy of the weights/bias (this means reverse the number of inputs and outputs on the weights).
Specified by: transpose in interface Layer. Overrides: transpose in class AbstractLayer&lt;LayerConfT extends BaseLayer&gt;

public void accumulateScore(double accum)
Description copied from interface: Model. Sets a rolling tally for the score.
Specified by: accumulateScore in interface Model. Overrides: accumulateScore in class AbstractLayer&lt;LayerConfT extends BaseLayer&gt;
Parameters:
accum - the amount to accumulate

public void clear()
Description copied from interface: Model. Clear the input.
Specified by: clear in interface Model. Overrides: clear in class AbstractLayer&lt;LayerConfT extends BaseLayer&gt;

public void clearNoiseWeightParams()

public boolean hasBias()
Whether this layer has a bias term. Many layers (dense, convolutional, output, embedding) have biases by default, but no-bias versions are possible via configuration.