A reconstructable layer for this network
the probability that a neuron is retained (not dropped) during drop-out training. (default: 1.0 = 100%)
All weights of layers
Compute the output of the neural network for a given input (without reconstruction). If drop-out is used, the output is multiplied by the presence probability to average out the drop-out effect.
an input vector
the output vector
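The drop-out averaging described above can be sketched as follows. This is a minimal NumPy illustration, not this library's API; `apply_layer`, the tanh activation, and all parameter names are hypothetical:

```python
import numpy as np

def apply_layer(weights, bias, x, presence_prob=1.0):
    # Hypothetical single layer with a tanh activation.
    h = np.tanh(weights @ x + bias)
    # Multiply by the presence probability so the expected activation
    # matches what drop-out produced during training.
    return presence_prob * h

W = np.eye(2)
b = np.zeros(2)
x = np.array([0.5, -0.5])
y = apply_layer(W, b, x, presence_prob=0.5)  # half-scaled tanh output
```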
All accumulated delta weights of layers
all accumulated delta weights
Decode computation for training. If drop-out is used, entries of the input vector are dropped out.
hidden values
output matrix
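The decode step with input drop-out might look like the following NumPy sketch (all names hypothetical; the mask draws from a uniform distribution so each entry survives with the presence probability):

```python
import numpy as np

rng = np.random.default_rng(0)

def decode_for_training(dec_w, dec_b, hidden, presence_prob=1.0):
    # Drop each entry of the input (hidden) vector
    # with probability 1 - presence_prob.
    mask = (rng.random(hidden.shape) < presence_prob).astype(hidden.dtype)
    return np.tanh(dec_w @ (hidden * mask) + dec_b)

dec_w = np.eye(3)
dec_b = np.zeros(3)
hidden = np.array([0.2, -0.1, 0.4])
out = decode_for_training(dec_w, dec_b, hidden, presence_prob=1.0)
```

With `presence_prob=1.0` the mask is all ones, so decoding reduces to the plain affine-plus-activation computation.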
Backpropagation algorithm for the decoding phase
the error backpropagated from the error function
Encode computation for training. If drop-out is used, entries of the input vector are dropped out.
input matrix
hidden values
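Since the encode step takes an input matrix, a sketch would apply the drop-out mask entry-wise across all samples. A hedged NumPy illustration (one sample per column; names hypothetical):

```python
import numpy as np

rng = np.random.default_rng(1)

def encode_for_training(enc_w, enc_b, x_mat, presence_prob=1.0):
    # x_mat: input matrix, one sample per column.
    # Drop-out is applied entry-wise to the input before encoding.
    mask = (rng.random(x_mat.shape) < presence_prob).astype(x_mat.dtype)
    return np.tanh(enc_w @ (x_mat * mask) + enc_b[:, None])

enc_w = np.zeros((2, 4))
enc_b = np.zeros(2)
x_mat = np.ones((4, 3))
hidden = encode_for_training(enc_w, enc_b, x_mat)
```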
Backpropagation algorithm for the encoding phase
the error backpropagated from the error function
Collected input & output of each layer
Forward computation for training. If drop-out is used, entries of the input vector are dropped out.
input matrix
output matrix
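A training-time forward pass typically records the input and output of each layer, matching the "collected input & output" noted above, so backpropagation can reuse them. A minimal sketch under that assumption (not the library's actual structure):

```python
import numpy as np

def forward_for_training(layers, x):
    # Record (input, output) of every layer; backpropagation
    # reads these collected values later.
    records = []
    out = x
    for w, b in layers:
        inp = out
        out = np.tanh(w @ inp + b)
        records.append((inp, out))
    return out, records

layers = [(np.eye(2), np.zeros(2)), (np.eye(2), np.zeros(2))]
out, records = forward_for_training(layers, np.array([0.3, -0.3]))
```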
A reconstructable layer for this network
Sugar: forward computation for validation. Calls apply(x).
input matrix
output matrix
Reconstruct the given hidden value
hidden value to be reconstructed.
the reconstructed value.
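For a reconstructable layer, reconstruction commonly decodes with the transpose of the encoder weights (a tied-weights assumption; whether this library ties weights is not stated here, and all names below are hypothetical):

```python
import numpy as np

def reconstruct(enc_w, dec_b, hidden):
    # Tied-weights assumption: decode with the transpose
    # of the encoder weight matrix.
    return np.tanh(enc_w.T @ hidden + dec_b)

enc_w = np.zeros((2, 4))   # encodes a 4-vector into a 2-vector
dec_b = np.zeros(4)
recon = reconstruct(enc_w, dec_b, np.array([0.5, -0.5]))
```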
Serialize network to JSON
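One straightforward way to serialize a network's weights to JSON is to convert each weight matrix and bias vector to nested lists. A hedged sketch, not the library's actual serialization format:

```python
import json
import numpy as np

def network_to_json(layers):
    # Nested lists are JSON-friendly; each layer becomes an object
    # holding its weight matrix and bias vector.
    return json.dumps(
        [{"weight": w.tolist(), "bias": b.tolist()} for w, b in layers]
    )

layers = [(np.eye(2), np.zeros(2))]
doc = network_to_json(layers)
restored = json.loads(doc)
```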
Backpropagation algorithm
the error backpropagated from the error function
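For a single tanh layer, the backpropagation step takes the error from the error function, produces the delta weights to accumulate, and passes the remaining error downstream. A minimal NumPy sketch (names hypothetical, assuming a tanh activation):

```python
import numpy as np

def backprop_layer(w, inp, out, err_out):
    # err_out: error backpropagated from the error function (dE/d out).
    delta = err_out * (1.0 - out ** 2)   # tanh'(z) = 1 - tanh(z)^2
    grad_w = np.outer(delta, inp)        # delta weights to accumulate
    grad_b = delta
    err_in = w.T @ delta                 # error passed to the layer below
    return grad_w, grad_b, err_in

w = np.eye(2)
inp = np.array([1.0, 0.0])
out = np.tanh(w @ inp)
grad_w, grad_b, err_in = backprop_layer(w, inp, out, np.ones(2))
```

The accumulated `grad_w`/`grad_b` correspond to the "accumulated delta weights" mentioned earlier, while `err_in` is what the next layer down receives as its backpropagated error.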
Network: Single-layer Autoencoder