public abstract class MultilayerPerceptron
extends java.lang.Object
implements java.io.Serializable
| Modifier and Type | Field and Description |
|---|---|
| `protected double` | `alpha` The momentum factor. |
| `protected double` | `eta` The learning rate. |
| `protected double` | `lambda` The weight decay factor, which also serves as a regularization term. |
| `protected Layer[]` | `net` The hidden layers. |
| `protected OutputLayer` | `output` The output layer. |
| `protected int` | `p` The dimensionality of the input data. |
| `protected double[]` | `target` The buffer that stores the desired target values of a training instance. |
| Constructor and Description |
|---|
| `MultilayerPerceptron(Layer... net)` Constructor. |
| Modifier and Type | Method and Description |
|---|---|
| `protected void` | `backpropagate(double[] x)` Propagates the errors back through the network. |
| `double` | `getLearningRate()` Returns the learning rate. |
| `double` | `getMomentum()` Returns the momentum factor. |
| `double` | `getWeightDecay()` Returns the weight decay factor. |
| `protected void` | `propagate(double[] x)` Propagates the signals through the neural network. |
| `void` | `setLearningRate(double eta)` Sets the learning rate. |
| `void` | `setMomentum(double alpha)` Sets the momentum factor. |
| `void` | `setWeightDecay(double lambda)` Sets the weight decay factor. |
| `java.lang.String` | `toString()` |
| `protected void` | `update()` Updates the weights. |
protected int p
The dimensionality of the input data.

protected OutputLayer output
The output layer.

protected Layer[] net
The hidden layers.

protected double[] target
The buffer that stores the desired target values of a training instance.

protected double eta
The learning rate.

protected double alpha
The momentum factor.

protected double lambda
The weight decay factor, which also serves as a regularization term.
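Together, `eta`, `alpha`, and `lambda` define the classical gradient-descent weight update with momentum and weight decay. The following self-contained sketch shows how a single weight change could be computed from these three hyperparameters; it is an illustration of the standard formula, not this class's actual implementation, and the helper name `step` is hypothetical.

```java
/**
 * Sketch of a single-weight update combining the learning rate (eta),
 * momentum factor (alpha), and weight decay factor (lambda).
 * Hypothetical helper for illustration, not the library's code.
 */
public class MomentumUpdateDemo {
    /** Returns the weight change: -eta * (grad + lambda * w) + alpha * prevDelta. */
    static double step(double w, double grad, double prevDelta,
                       double eta, double alpha, double lambda) {
        // Weight decay adds lambda * w to the error gradient; momentum
        // reuses a fraction alpha of the previous weight change.
        return -eta * (grad + lambda * w) + alpha * prevDelta;
    }

    public static void main(String[] args) {
        // One update for a weight w = 0.5 with gradient 0.2 and previous delta 0.1.
        double delta = step(0.5, 0.2, 0.1, 0.1, 0.9, 0.01);
        System.out.println(delta);
    }
}
```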
public MultilayerPerceptron(Layer... net)
Parameters:
net - the layers from bottom to top. The input layer should not be included.

public java.lang.String toString()
Overrides:
toString in class java.lang.Object
public void setLearningRate(double eta)
Parameters:
eta - the learning rate.

public void setMomentum(double alpha)
Parameters:
alpha - the momentum factor.

public void setWeightDecay(double lambda)
Parameters:
lambda - the weight decay factor.
public double getLearningRate()
Returns the learning rate.

public double getMomentum()
Returns the momentum factor.

public double getWeightDecay()
Returns the weight decay factor.

protected void propagate(double[] x)
Propagates the signals through the neural network.

protected void backpropagate(double[] x)
Propagates the errors back through the network.

protected void update()
Updates the weights.
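The three protected methods form the training cycle: `propagate` runs the forward pass, `backpropagate` computes the error gradients, and `update` applies them to the weights. The toy sketch below mirrors that cycle on a single linear unit trained toward y = 2x; it is an illustrative stand-in with hypothetical names, not this class's implementation.

```java
/**
 * Toy illustration of the propagate -> backpropagate -> update cycle
 * on a single linear unit. Hypothetical code, not the library's.
 */
public class TrainingCycleSketch {
    double eta = 0.05;      // learning rate
    double alpha = 0.5;     // momentum factor
    double lambda = 1e-4;   // weight decay factor
    double w = 0.0;         // the single weight
    double prevDelta = 0.0; // previous weight change, for momentum
    double out, grad;

    void propagate(double x) { out = w * x; }   // forward pass
    void backpropagate(double x, double target) {
        grad = (out - target) * x;              // dE/dw for E = (out - target)^2 / 2
    }
    void update() {                             // momentum + weight decay
        double delta = -eta * (grad + lambda * w) + alpha * prevDelta;
        w += delta;
        prevDelta = delta;
    }

    /** Trains for the given number of epochs and returns the learned weight. */
    static double train(int epochs) {
        TrainingCycleSketch net = new TrainingCycleSketch();
        double[] xs = {1, 2, 3};
        for (int e = 0; e < epochs; e++) {
            for (double x : xs) {
                net.propagate(x);
                net.backpropagate(x, 2 * x);    // desired target: y = 2x
                net.update();
            }
        }
        return net.w;
    }

    public static void main(String[] args) {
        System.out.println(train(200));         // the weight approaches 2
    }
}
```

The same per-instance sequence, one forward pass, one backward pass, one weight update, is what a stochastic-gradient training loop would drive through these protected methods.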