public class StackedDenoisingAutoEncoder extends BaseMultiLayerNetwork
Nested Class Summary

Modifier and Type | Class and Description
---|---
static class | StackedDenoisingAutoEncoder.Builder

Fields inherited from class BaseMultiLayerNetwork:
errorTolerance, layers, learningRateUpdate
Constructor and Description
---|
StackedDenoisingAutoEncoder() |
StackedDenoisingAutoEncoder(int nIns, int[] hiddenLayerSizes, int nOuts, int n_layers, org.apache.commons.math3.random.RandomGenerator rng) |
StackedDenoisingAutoEncoder(int n_ins, int[] hiddenLayerSizes, int nOuts, int nLayers, org.apache.commons.math3.random.RandomGenerator rng, org.jblas.DoubleMatrix input, org.jblas.DoubleMatrix labels) |
Modifier and Type | Method and Description
---|---
NeuralNetwork | createLayer(org.jblas.DoubleMatrix input, int nVisible, int nHidden, org.jblas.DoubleMatrix W, org.jblas.DoubleMatrix hbias, org.jblas.DoubleMatrix vBias, org.apache.commons.math3.random.RandomGenerator rng, int index) — Creates a layer depending on the index.
NeuralNetwork[] | createNetworkLayers(int numLayers)
void | pretrain(double lr, double corruptionLevel, int epochs)
void | pretrain(org.jblas.DoubleMatrix input, double lr, double corruptionLevel, int epochs) — Unsupervised pretraining based on reconstructing the input from a corrupted version.
void | trainNetwork(org.jblas.DoubleMatrix input, org.jblas.DoubleMatrix labels, Object[] otherParams) — Train the network by running unsupervised pretraining followed by SGD/finetuning.

Methods inherited from class BaseMultiLayerNetwork:
applyTransforms, asDecoder, backProp, backPropStep, clone, encode, fanIn, feedForward, finetune, finetune, getActivation, getColumnMeans, getColumnStds, getColumnSums, getDist, getErrorTolerance, getFanIn, getHiddenLayerSizes, getInput, getL2, getLabels, getLayers, getLearningRateUpdate, getLogLayer, getMomentum, getnIns, getnLayers, getnOuts, getOptimizer, getRenderWeightsEveryNEpochs, getRng, getSigmoidLayers, getSparsity, getWeightTransforms, initializeLayers, initializeNetwork, isForceNumEpochs, isShouldBackProp, isShouldInit, isToDecode, isUseRegularization, load, loadFromFile, merge, negativeLogLikelihood, predict, reconstruct, reconstruct, setActivation, setColumnMeans, setColumnStds, setColumnSums, setDist, setErrorTolerance, setFanIn, setForceNumEpochs, setHiddenLayerSizes, setInput, setL2, setLabels, setLayers, setLearningRateUpdate, setLogLayer, setMomentum, setnIns, setnLayers, setnOuts, setOptimizer, setRenderWeightsEveryNEpochs, setRng, setShouldBackProp, setShouldInit, setSigmoidLayers, setSparsity, setToDecode, setUseRegularization, setWeightTransforms, update, write
public StackedDenoisingAutoEncoder()
public StackedDenoisingAutoEncoder(int n_ins, int[] hiddenLayerSizes, int nOuts, int nLayers, org.apache.commons.math3.random.RandomGenerator rng, org.jblas.DoubleMatrix input, org.jblas.DoubleMatrix labels)
public StackedDenoisingAutoEncoder(int nIns, int[] hiddenLayerSizes, int nOuts, int n_layers, org.apache.commons.math3.random.RandomGenerator rng)
public void pretrain(double lr, double corruptionLevel, int epochs)

public void pretrain(org.jblas.DoubleMatrix input, double lr, double corruptionLevel, int epochs)
Unsupervised pretraining based on reconstructing the input from a corrupted version.
Parameters:
input - the input to train on
lr - the starting learning rate
corruptionLevel - the fraction of inputs to corrupt (the smaller the number of inputs, the higher the corruption level should be)
epochs - the number of iterations to run

public void trainNetwork(org.jblas.DoubleMatrix input, org.jblas.DoubleMatrix labels, Object[] otherParams)
Train the network by running unsupervised pretraining followed by SGD/finetuning.
Specified by:
trainNetwork in class BaseMultiLayerNetwork
Parameters:
input - input examples
labels - output labels
otherParams - packed positionally:
(double) learningRate
(double) corruptionLevel
(int) epochs
Optional:
(double) finetune lr
(int) finetune epochs

public NeuralNetwork createLayer(org.jblas.DoubleMatrix input, int nVisible, int nHidden, org.jblas.DoubleMatrix W, org.jblas.DoubleMatrix hbias, org.jblas.DoubleMatrix vBias, org.apache.commons.math3.random.RandomGenerator rng, int index)
Creates a layer depending on the index. One use case for this is a CDBN, where the first layer needs to be a CRBM for continuous inputs. Be sure to call super.initializeNetwork to handle the passing of baseline parameters such as fan-in and rendering.
Specified by:
createLayer in class BaseMultiLayerNetwork
Parameters:
input - the input to the layer
nVisible - the number of visible inputs
nHidden - the number of hidden units
W - the weight matrix
hbias - the hidden bias
vBias - the visible bias
rng - the random number generator to use (this is important: a mis-referenced rng will produce meaningless numbers)
index - the index of the layer
Returns:
an RBM
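The corruptionLevel parameter used by pretrain above is the fraction of input components randomly masked before the network tries to reconstruct them. A minimal, self-contained sketch of that masking step — plain Java arrays and java.util.Random standing in for the jblas DoubleMatrix and commons-math RandomGenerator this class actually uses, and corrupt being a hypothetical helper, not part of this API:

```java
import java.util.Random;

public class CorruptionDemo {
    /**
     * Returns a copy of the input with roughly corruptionLevel * input.length
     * entries zeroed out — the masking noise a denoising autoencoder
     * reconstructs from. Illustrative only; the real implementation
     * operates on jblas DoubleMatrix instances.
     */
    static double[] corrupt(double[] input, double corruptionLevel, Random rng) {
        double[] corrupted = input.clone();
        for (int i = 0; i < corrupted.length; i++) {
            if (rng.nextDouble() < corruptionLevel) {
                corrupted[i] = 0.0; // mask this input component
            }
        }
        return corrupted;
    }

    public static void main(String[] args) {
        double[] x = {1, 1, 1, 1, 1, 1, 1, 1, 1, 1};
        double[] noisy = corrupt(x, 0.3, new Random(42));
        int zeroed = 0;
        for (double v : noisy) if (v == 0.0) zeroed++;
        System.out.println("zeroed " + zeroed + " of " + x.length + " inputs");
    }
}
```

During pretraining each layer learns to map such corrupted inputs back to the clean originals, which is why a higher corruption level is suggested when there are fewer inputs.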
public NeuralNetwork[] createNetworkLayers(int numLayers)
Specified by:
createNetworkLayers in class BaseMultiLayerNetwork
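trainNetwork's Object[] otherParams is packed positionally, so a caller and implementation must agree on the order and boxed types listed above. A sketch of that convention — the helper names here are hypothetical, not part of this API; only the positions and casts follow the documented list:

```java
public class OtherParamsDemo {
    // Hypothetical accessors mirroring the casts the documented
    // otherParams layout implies: learningRate, corruptionLevel, epochs,
    // then the optional finetune pair.
    static double learningRate(Object[] p)    { return (Double) p[0]; }
    static double corruptionLevel(Object[] p) { return (Double) p[1]; }
    static int epochs(Object[] p)             { return (Integer) p[2]; }
    static boolean hasFinetune(Object[] p)    { return p.length >= 5; }

    public static void main(String[] args) {
        // Required entries first, then the optional finetuning pair.
        Object[] otherParams = {0.01, 0.3, 100, 0.001, 50};
        System.out.println("lr=" + learningRate(otherParams)
            + " corruption=" + corruptionLevel(otherParams)
            + " epochs=" + epochs(otherParams)
            + " finetune=" + hasFinetune(otherParams));
    }
}
```

Because the array is untyped, passing an int where a double is expected (e.g. 1 instead of 0.01) fails only at runtime with a ClassCastException, so the boxed types matter as much as the order.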
Copyright © 2014. All Rights Reserved.