Class ActivationThresholdedReLU

    • Constructor Detail

      • ActivationThresholdedReLU

        public ActivationThresholdedReLU()
      • ActivationThresholdedReLU

        public ActivationThresholdedReLU(double theta)
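
        A minimal construction sketch (not part of the Javadoc itself). It assumes the usual ND4J import path org.nd4j.linalg.activations.impl, the standard thresholded-ReLU semantics f(x) = x for x > theta and 0 otherwise, and a default theta of 1.0 for the no-arg constructor; treat those details as assumptions.

            import org.nd4j.linalg.activations.impl.ActivationThresholdedReLU;

            public class ThresholdedReluConstruction {
                public static void main(String[] args) {
                    // No-arg constructor: theta falls back to the class default
                    // (assumed to be 1.0 here).
                    ActivationThresholdedReLU defaultTheta = new ActivationThresholdedReLU();

                    // Explicit threshold: inputs above theta pass through unchanged,
                    // everything else is mapped to 0 (assumed thresholded-ReLU semantics).
                    ActivationThresholdedReLU customTheta = new ActivationThresholdedReLU(0.5);

                    System.out.println(defaultTheta);
                    System.out.println(customTheta);
                }
            }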
    • Method Detail

      • getActivation

        public INDArray getActivation(INDArray in,
                                      boolean training)
        Description copied from interface: IActivation
        Carry out the activation function on the input array (usually known as 'preOut' or 'z'). Implementations must overwrite "in", transform it in place, and return "in". Behaviour may differ between training and test.
        Parameters:
        in - input array.
        training - true when training.
        Returns:
        transformed activation
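
        A short forward-pass sketch for this method. It assumes the ND4J import paths shown below and the thresholded-ReLU semantics f(x) = x for x > theta, 0 otherwise; the expected output in the comments follows from that assumption.

            import org.nd4j.linalg.activations.impl.ActivationThresholdedReLU;
            import org.nd4j.linalg.api.ndarray.INDArray;
            import org.nd4j.linalg.factory.Nd4j;

            public class ThresholdedReluForward {
                public static void main(String[] args) {
                    ActivationThresholdedReLU act = new ActivationThresholdedReLU(1.0);

                    // Pre-activations ('preOut' / 'z').
                    INDArray preOut = Nd4j.create(new double[]{-2.0, 0.5, 1.0, 3.0});

                    // Per the interface contract, the input is transformed in place
                    // and the same array is returned.
                    INDArray out = act.getActivation(preOut, true);

                    System.out.println(out);             // expected [0, 0, 0, 3] under f(x) = x for x > theta
                    System.out.println(out == preOut);   // true: 'in' was overwritten, not copied
                }
            }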
      • backprop

        public Pair<INDArray,​INDArray> backprop​(INDArray in,
                                                      INDArray epsilon)
        Description copied from interface: IActivation
        Backpropagate the errors through the activation function, given the input z and the error gradient epsilon (dL/da).
        Returns 2 INDArrays:
        (a) The gradient dL/dz, calculated from dL/da, and
        (b) The parameter gradients dL/dW, where W denotes the weights of the activation function. For activation functions with no trainable weights, this will be null.
        Parameters:
        in - Input, before applying the activation function (z, or 'preOut')
        epsilon - Gradient to be backpropagated: dL/da, where L is the loss function
        Returns:
        dL/dz and dL/dW, for weights W (null if the activation function has no weights)
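
        A backprop sketch under the same assumptions as above. The Pair import path varies between ND4J releases (org.nd4j.linalg.primitives.Pair in older ones, org.nd4j.common.primitives.Pair in newer ones), so adjust it to the version in use; the expected dL/dz values assume a derivative of 1 where z > theta and 0 elsewhere.

            import org.nd4j.linalg.activations.impl.ActivationThresholdedReLU;
            import org.nd4j.linalg.api.ndarray.INDArray;
            import org.nd4j.linalg.factory.Nd4j;
            import org.nd4j.linalg.primitives.Pair;   // newer releases: org.nd4j.common.primitives.Pair

            public class ThresholdedReluBackprop {
                public static void main(String[] args) {
                    ActivationThresholdedReLU act = new ActivationThresholdedReLU(1.0);

                    // Pre-activations z and the incoming gradient dL/da from the layer above.
                    INDArray z       = Nd4j.create(new double[]{-2.0, 0.5, 3.0, 4.0});
                    INDArray epsilon = Nd4j.create(new double[]{1.0, 1.0, 1.0, 1.0});

                    Pair<INDArray, INDArray> grads = act.backprop(z, epsilon);

                    // First element: dL/dz. Roughly [0, 0, 1, 1] under the assumed semantics,
                    // since the local derivative is 1 where z > theta and 0 elsewhere.
                    System.out.println(grads.getFirst());

                    // Second element: dL/dW. ThresholdedReLU has no trainable weights,
                    // so this is null per the contract above.
                    System.out.println(grads.getSecond());
                }
            }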