public class GaussianRectifiedLinearDBN extends DBN
Modifier and Type | Class and Description
---|---
static class | GaussianRectifiedLinearDBN.Builder
dropOut, errorTolerance, gradientListeners, learningRateUpdate, lossFunction, multiLayerGradientListeners, normalizeByInputRows, optimizationAlgorithm
(fields inherited from class BaseMultiLayerNetwork)
Constructor and Description
---
GaussianRectifiedLinearDBN()
GaussianRectifiedLinearDBN(int nIns, int[] hiddenLayerSizes, int nOuts, int nLayers, org.apache.commons.math3.random.RandomGenerator rng)
GaussianRectifiedLinearDBN(int nIn, int[] hiddenLayerSizes, int nOuts, int nLayers, org.apache.commons.math3.random.RandomGenerator rng, org.jblas.DoubleMatrix input, org.jblas.DoubleMatrix labels)
Modifier and Type | Method and Description
---|---
HiddenLayer | createHiddenLayer(int index, int nIn, int nOut, ActivationFunction activation, org.apache.commons.math3.random.RandomGenerator rng, org.jblas.DoubleMatrix layerInput, org.apache.commons.math3.distribution.RealDistribution dist). Creates a hidden layer with the given parameters.
NeuralNetwork | createLayer(org.jblas.DoubleMatrix input, int nVisible, int nHidden, org.jblas.DoubleMatrix W, org.jblas.DoubleMatrix hBias, org.jblas.DoubleMatrix vBias, org.apache.commons.math3.random.RandomGenerator rng, int index). Creates a layer depending on the index.
NeuralNetwork[] | createNetworkLayers(int numLayers)
Methods inherited from class DBN: pretrain, pretrain, pretrain, trainNetwork
Methods inherited from class BaseMultiLayerNetwork: applyTransforms, asDecoder, backProp, backPropStep, clone, encode, fanIn, feedForward, feedForward, finetune, finetune, getActivation, getColumnMeans, getColumnStds, getColumnSums, getDist, getDropOut, getErrorTolerance, getFanIn, getGradient, getHiddenBiasTransforms, getHiddenLayerSizes, getInput, getL2, getLabels, getLayers, getLearningRateUpdate, getLogLayer, getLossFunction, getMomentum, getnIns, getnLayers, getnOuts, getOptimizationAlgorithm, getOptimizer, getReconstructionCrossEntropy, getRenderWeightsEveryNEpochs, getRng, getSigmoidLayers, getSparsity, getVisibleBiasTransforms, getWeightTransforms, init, initialize, initializeLayers, initializeNetwork, isForceNumEpochs, isNormalizeByInputRows, isShouldBackProp, isShouldInit, isToDecode, isUseAdaGrad, isUseHiddenActivationsForwardProp, isUseRegularization, load, loadFromFile, merge, negativeLogLikelihood, predict, reconstruct, reconstruct, resetAdaGrad, setActivation, setColumnMeans, setColumnStds, setColumnSums, setDist, setDropOut, setErrorTolerance, setFanIn, setForceNumEpochs, setHiddenLayerSizes, setInput, setL2, setLabels, setLayers, setLearningRateUpdate, setLogLayer, setLossFunction, setMomentum, setnIns, setnLayers, setNormalizeByInputRows, setnOuts, setOptimizationAlgorithm, setOptimizer, setRenderWeightsEveryNEpochs, setRng, setShouldBackProp, setShouldInit, setSigmoidLayers, setSparsity, setToDecode, setUseAdaGrad, setUseHiddenActivationsForwardProp, setUseRegularization, setWeightTransforms, synchonrizeRng, update, write
public GaussianRectifiedLinearDBN()
public GaussianRectifiedLinearDBN(int nIn, int[] hiddenLayerSizes, int nOuts, int nLayers, org.apache.commons.math3.random.RandomGenerator rng, org.jblas.DoubleMatrix input, org.jblas.DoubleMatrix labels)
public GaussianRectifiedLinearDBN(int nIns, int[] hiddenLayerSizes, int nOuts, int nLayers, org.apache.commons.math3.random.RandomGenerator rng)
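The class name refers to a DBN whose hidden units are rectified linear and whose visible units are Gaussian. As a hedged, stand-alone illustration (not the library's code), such units are commonly modeled as noisy ReLUs, h = max(0, x + eps) with eps drawn from N(0, sigmoid(x)); all names below are illustrative:

```java
import java.util.Random;

// Hedged sketch of a noisy rectified-linear hidden unit over Gaussian input.
// This is an illustration of the unit type, not deeplearning4j code.
class NoisyReluSketch {
    static double sigmoid(double x) {
        return 1.0 / (1.0 + Math.exp(-x));
    }

    // Sample one rectified-linear hidden activation for pre-activation x,
    // using the caller-supplied rng so results stay reproducible.
    static double sampleHidden(double x, Random rng) {
        double eps = rng.nextGaussian() * Math.sqrt(sigmoid(x));
        return Math.max(0.0, x + eps);
    }
}
```

Passing the same seeded `Random` yields identical samples, which is why the constructors above thread a single `RandomGenerator` through the whole network.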
public NeuralNetwork createLayer(org.jblas.DoubleMatrix input, int nVisible, int nHidden, org.jblas.DoubleMatrix W, org.jblas.DoubleMatrix hBias, org.jblas.DoubleMatrix vBias, org.apache.commons.math3.random.RandomGenerator rng, int index)
Description copied from class: BaseMultiLayerNetwork

Creates a layer depending on the index. This matters for continuous variations such as the CDBN, where the first layer needs to be a CRBM for continuous inputs. Please be sure to call super.initializeNetwork to handle the passing of baseline parameters such as fan-in and rendering.

Overrides:
createLayer in class DBN

Parameters:
input - the input to the layer
nVisible - the number of visible inputs
nHidden - the number of hidden units
W - the weight vector
hBias - the hidden bias
vBias - the visible bias
rng - the rng to use (this is important: a mis-referenced rng will make the numbers meaningless)
index - the index of the layer

Returns:
RBM
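As the description above notes, createLayer dispatches on the layer index so the first layer can model continuous input while deeper layers stay binary. A minimal sketch of that dispatch pattern, with illustrative names (LayerKind is a stand-in, not a library type):

```java
// Illustrative dispatch-on-index pattern: a DBN variant picks a
// continuous-input layer type only at index 0. Not deeplearning4j code.
class LayerFactorySketch {
    enum LayerKind { CONTINUOUS_FIRST, BINARY }

    // First layer handles continuous (Gaussian) input; deeper layers do not.
    static LayerKind kindFor(int index) {
        if (index < 0) {
            throw new IllegalArgumentException("negative layer index");
        }
        return index == 0 ? LayerKind.CONTINUOUS_FIRST : LayerKind.BINARY;
    }
}
```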
public HiddenLayer createHiddenLayer(int index, int nIn, int nOut, ActivationFunction activation, org.apache.commons.math3.random.RandomGenerator rng, org.jblas.DoubleMatrix layerInput, org.apache.commons.math3.distribution.RealDistribution dist)
Overrides:
createHiddenLayer in class BaseMultiLayerNetwork

Parameters:
nIn - the number of inputs
nOut - the number of outputs
activation - the activation function for the layer
rng - the rng to use for sampling
layerInput - the layer starting input
dist - the probability distribution to use for generating weights

public NeuralNetwork[] createNetworkLayers(int numLayers)
Overrides:
createNetworkLayers in class DBN
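The dist argument to createHiddenLayer seeds the hidden layer's weights. As a hedged illustration of what a distribution-driven initializer does (plain double[][] stands in for org.jblas.DoubleMatrix, and sampleWeights is a hypothetical helper, not a library method):

```java
import java.util.function.DoubleSupplier;

// Illustrative weight initialization: fill an nIn x nOut matrix by drawing
// each entry independently from a supplied distribution.
class WeightInitSketch {
    static double[][] sampleWeights(int nIn, int nOut, DoubleSupplier dist) {
        double[][] w = new double[nIn][nOut];
        for (int i = 0; i < nIn; i++) {
            for (int j = 0; j < nOut; j++) {
                w[i][j] = dist.getAsDouble(); // one draw per weight
            }
        }
        return w;
    }
}
```

With a commons-math RealDistribution one would pass `dist::sample` as the supplier; here any `DoubleSupplier` works, which keeps the sketch self-contained.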
Copyright © 2014. All Rights Reserved.