Interface IUpdater

    • Method Detail

      • stateSize

        long stateSize(long numParams)
        Determine the updater state size for the given number of parameters. This is usually an integer multiple (0, 1, or 2) of the number of parameters in a layer.
        Parameters:
        numParams - Number of parameters
        Returns:
        Updater state size for the given number of parameters
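
        For example (a minimal, illustrative sketch; it assumes the ND4J Adam and Sgd configuration classes, which implement this interface):

            // Adam keeps two state values (m and v) per parameter, so its state size is 2 * numParams;
            // plain SGD keeps no state, so its state size is 0.
            IUpdater adamConfig = new org.nd4j.linalg.learning.config.Adam(1e-3);
            IUpdater sgdConfig  = new org.nd4j.linalg.learning.config.Sgd(1e-2);

            long numParams = 1000;
            long adamState = adamConfig.stateSize(numParams);   // expected: 2000
            long sgdState  = sgdConfig.stateSize(numParams);    // expected: 0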
      • instantiate

        GradientUpdater instantiate(INDArray viewArray,
                                    boolean initializeViewArray)
        Create a new gradient updater
        Parameters:
        viewArray - The updater state view array
        initializeViewArray - If true: initialise the updater state
        Returns:
        A new GradientUpdater instance, backed by the given view array
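
        A minimal usage sketch (it assumes ND4J's Adam configuration class and the Nd4j array factory; the class name and parameter count are illustrative):

            import org.nd4j.linalg.api.ndarray.INDArray;
            import org.nd4j.linalg.factory.Nd4j;
            import org.nd4j.linalg.learning.GradientUpdater;
            import org.nd4j.linalg.learning.config.Adam;
            import org.nd4j.linalg.learning.config.IUpdater;

            public class InstantiateExample {
                public static void main(String[] args) {
                    long numParams = 1000;
                    IUpdater config = new Adam(1e-3);

                    // Allocate a flat view array of the required state size, then build the updater over it.
                    INDArray view = Nd4j.create(1, (int) config.stateSize(numParams));
                    GradientUpdater updater = config.instantiate(view, true);   // true: initialise the state

                    // The updater can then be applied to a flattened gradient in place, e.g.:
                    // updater.applyUpdater(flatGradient, iteration, epoch);
                }
            }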
      • clone

        IUpdater clone()
        Clone the updater
      • getLearningRate

        double getLearningRate(int iteration,
                               int epoch)
        Get the learning rate - if any - for the updater, at the specified iteration and epoch. Note that if no learning rate is applicable (AdaDelta, NoOp updaters, etc.) then Double.NaN should be returned.
        Parameters:
        iteration - Iteration at which to get the learning rate
        epoch - Epoch at which to get the learning rate
        Returns:
        Learning rate, or Double.NaN if no learning rate is applicable for this updater
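
        For example (an illustrative sketch assuming ND4J's Adam and AdaDelta configuration classes):

            // A fixed-rate updater reports its configured value; an updater with no learning rate reports Double.NaN.
            IUpdater adam     = new org.nd4j.linalg.learning.config.Adam(1e-3);
            IUpdater adaDelta = new org.nd4j.linalg.learning.config.AdaDelta();

            double lr1 = adam.getLearningRate(100, 5);      // expected: 1e-3
            double lr2 = adaDelta.getLearningRate(100, 5);  // expected: Double.NaN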
      • hasLearningRate

        boolean hasLearningRate()
        Returns:
        True if the updater has a learning rate hyperparameter, false otherwise
      • setLrAndSchedule

        void setLrAndSchedule(double lr,
                              ISchedule lrSchedule)
        Set the learning rate and schedule. Note: may throw an exception if hasLearningRate() returns false.
        Parameters:
        lr - Learning rate to set (typically not used if LR schedule is non-null)
        lrSchedule - Learning rate schedule to set (may be null)
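
        A sketch combining hasLearningRate() and setLrAndSchedule() (it assumes ND4J's Adam configuration class and ExponentialSchedule; the values are illustrative):

            IUpdater updater = new org.nd4j.linalg.learning.config.Adam(1e-3);

            if (updater.hasLearningRate()) {
                // Fixed learning rate, no schedule:
                updater.setLrAndSchedule(1e-4, null);

                // Or a schedule that decays the rate each epoch (the fixed lr is then typically ignored):
                org.nd4j.linalg.schedule.ISchedule schedule =
                        new org.nd4j.linalg.schedule.ExponentialSchedule(
                                org.nd4j.linalg.schedule.ScheduleType.EPOCH, 1e-3, 0.95);
                updater.setLrAndSchedule(1e-3, schedule);
            }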