weights for update
Activation Function
Forward computation
accumulated delta values
Sugar: reconstruction
hidden layer output matrix
tuple of reconstruction output
Backpropagation of reconstruction. For details of the backpropagation computation, see kr.ac.kaist.ir.deep.layer.Layer.
error matrix to be propagated
input of this layer
final reconstruction output of this layer
propagated error
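The reconstruction backpropagation above can be sketched as follows. This is an illustrative sketch only, assuming a tied-weight autoencoder whose decoder reuses W.T with a sigmoid activation; the function name `rec_backward` and all shapes are hypothetical, not the library's actual API.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def rec_backward(W, hidden, recon, error):
    # error  : dG/d(recon), the reconstruction error matrix to be propagated
    # hidden : output of the hidden layer (input of the decoding step)
    # recon  : final reconstruction output, here sigmoid(W.T @ hidden)
    delta = error * recon * (1.0 - recon)  # dG/dX of the decoding map
    dW = (delta @ hidden.T).T              # tied-weight gradient contribution
    d_hidden = W @ delta                   # error propagated to the hidden layer
    return dW, d_hidden

rng = np.random.default_rng(1)
W = rng.normal(size=(3, 4))
hidden = sigmoid(rng.normal(size=(3, 1)))
recon = sigmoid(W.T @ hidden)
dW, d_hidden = rec_backward(W, hidden, recon, recon - 0.5)
```

The returned `d_hidden` plays the role of the propagated error, and `dW` would be accumulated into the delta values for the weight update.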
Translate this layer into a JSON object (for the Play! framework)
JSON object describing this layer
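A serialized layer might look like the fragment below. This is a hypothetical shape for illustration only; the actual field names and structure produced by the library's Play! JSON serialization may differ.

```json
{
  "type": "BasicLayer",
  "act": "SigmoidFunction",
  "weight": { "rows": 2, "cols": 3, "data": [[0.1, -0.2, 0.3], [0.0, 0.5, -0.1]] },
  "bias": [0.01, -0.02]
}
```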
Backward computation.
error to be propagated (dG/dF, as propagated from the higher layer)
input of this layer (in this case, x; an entry of dX/dW)
output of this layer (in this case, y)
propagated error (in this case, dG/dx)
Let this layer compute the function F composed with the affine map X(x) = W·x + b, and let the higher layer compute the function G. The weight is updated with dG/dW, and dG/dx is propagated to the lower layer. All computations use denominator layout (cf. the Wikipedia page on matrix calculus); for the computation rules, see "The Matrix Cookbook".
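The backward computation described above can be sketched numerically. This is a minimal sketch, assuming a sigmoid activation and squared-error objective; the function name `backward` and all shapes are illustrative, not the library's actual API.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def backward(W, x, y, error):
    # error : dG/dy from the higher layer
    # With y = sigmoid(X) and X = W @ x + b, dy/dX = y * (1 - y) elementwise.
    delta = error * y * (1.0 - y)  # dG/dX
    dW = delta @ x.T               # dG/dW (denominator layout: same shape as W)
    db = delta                     # dG/db
    dx = W.T @ delta               # dG/dx, propagated to the lower layer
    return dW, db, dx

rng = np.random.default_rng(0)
W = rng.normal(size=(3, 4))
b = rng.normal(size=(3, 1))
x = rng.normal(size=(4, 1))
y = sigmoid(W @ x + b)
# Squared-error objective G = 0.5 * ||y - t||^2 with target t = 1, so dG/dy = y - t.
dW, db, dx = backward(W, x, y, y - np.ones_like(y))
```

Note that `dW` has the same shape as `W`, which is the point of the denominator layout: the gradient can be subtracted from the weight matrix directly during the update.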
Sugar: Forward computation. Calls apply(x).
input matrix
output matrix
Trait of a Layer that can be used for an autoencoder