Class TextGenerationJobConfig

    • Method Detail

      • completionCriteria

        public final AutoMLJobCompletionCriteria completionCriteria()

        How long a fine-tuning job is allowed to run. For TextGenerationJobConfig problem types, the MaxRuntimePerTrainingJobInSeconds attribute of AutoMLJobCompletionCriteria defaults to 72h (259200s).

        Returns:
        How long a fine-tuning job is allowed to run. For TextGenerationJobConfig problem types, the MaxRuntimePerTrainingJobInSeconds attribute of AutoMLJobCompletionCriteria defaults to 72h (259200s).
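
        A minimal builder sketch, assuming the usual software.amazon.awssdk.services.sagemaker.model imports and the AutoMLJobCompletionCriteria builder in this SDK, showing how the maximum runtime could be set explicitly instead of relying on the 72h default:

          TextGenerationJobConfig config = TextGenerationJobConfig.builder()
                  .completionCriteria(AutoMLJobCompletionCriteria.builder()
                          // 259200 seconds = 72 hours, matching the documented default
                          .maxRuntimePerTrainingJobInSeconds(259200)
                          .build())
                  .build();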
      • baseModelName

        public final String baseModelName()

        The name of the base model to fine-tune. Autopilot supports fine-tuning a variety of large language models. For information on the list of supported models, see Text generation models supporting fine-tuning in Autopilot. If no BaseModelName is provided, the default model used is Falcon7BInstruct.

        Returns:
        The name of the base model to fine-tune. Autopilot supports fine-tuning a variety of large language models. For information on the list of supported models, see Text generation models supporting fine-tuning in Autopilot. If no BaseModelName is provided, the default model used is Falcon7BInstruct.
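
        A minimal sketch, assuming the standard builder pattern of this SDK, showing how a base model name could be supplied explicitly; the name used here is the documented default:

          TextGenerationJobConfig config = TextGenerationJobConfig.builder()
                  .baseModelName("Falcon7BInstruct") // explicit, same as the documented default
                  .build();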
      • hasTextGenerationHyperParameters

        public final boolean hasTextGenerationHyperParameters()
        For responses, this returns true if the service returned a value for the TextGenerationHyperParameters property. This DOES NOT check that the value is non-empty (for which you should check the isEmpty() method on the property). This is useful because the SDK will never return a null collection or map, but you may need to differentiate between the service returning nothing (or null) and the service returning an empty collection or map. For requests, this returns true if a value for the property was specified in the request builder, and false if a value was not specified.
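
        A short usage sketch, assuming a TextGenerationJobConfig instance named config, showing how a caller might distinguish an absent field from one that is present but empty:

          if (config.hasTextGenerationHyperParameters()) {
              // The service (or request builder) set the field; it may still be empty.
              Map<String, String> params = config.textGenerationHyperParameters();
              System.out.println(params.isEmpty() ? "present but empty" : params.toString());
          } else {
              // The field was never set; textGenerationHyperParameters() still returns an empty map, not null.
              System.out.println("not returned by the service");
          }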
      • textGenerationHyperParameters

        public final Map<String, String> textGenerationHyperParameters()

        The hyperparameters used to configure and optimize the learning process of the base model. You can set any combination of the following hyperparameters for all base models. For more information on each supported hyperparameter, see Optimize the learning process of your text generation models with hyperparameters.

        • "epochCount": The number of times the model goes through the entire training dataset. Its value should be a string containing an integer value within the range of "1" to "10".

        • "batchSize": The number of data samples used in each iteration of training. Its value should be a string containing an integer value within the range of "1" to "64".

        • "learningRate": The step size at which a model's parameters are updated during training. Its value should be a string containing a floating-point value within the range of "0" to "1".

        • "learningRateWarmupSteps": The number of training steps during which the learning rate gradually increases before reaching its target or maximum value. Its value should be a string containing an integer value within the range of "0" to "250".

        Here is an example where all four hyperparameters are configured.

        { "epochCount":"5", "learningRate":"0.5", "batchSize": "32", "learningRateWarmupSteps": "10" }

        Attempts to modify the collection returned by this method will result in an UnsupportedOperationException.

        This method will never return null. If you would like to know whether the service returned this field (so that you can differentiate between null and empty), you can use the hasTextGenerationHyperParameters() method.

        Returns:
        The hyperparameters used to configure and optimize the learning process of the base model. You can set any combination of the following hyperparameters for all base models. For more information on each supported hyperparameter, see Optimize the learning process of your text generation models with hyperparameters.

        • "epochCount": The number of times the model goes through the entire training dataset. Its value should be a string containing an integer value within the range of "1" to "10".

        • "batchSize": The number of data samples used in each iteration of training. Its value should be a string containing an integer value within the range of "1" to "64".

        • "learningRate": The step size at which a model's parameters are updated during training. Its value should be a string containing a floating-point value within the range of "0" to "1".

        • "learningRateWarmupSteps": The number of training steps during which the learning rate gradually increases before reaching its target or maximum value. Its value should be a string containing an integer value within the range of "0" to "250".

        Here is an example where all four hyperparameters are configured.

        { "epochCount":"5", "learningRate":"0.5", "batchSize": "32", "learningRateWarmupSteps": "10" }

      • modelAccessConfig

        public final ModelAccessConfig modelAccessConfig()
        Returns the value of the ModelAccessConfig property for this object.
        Returns:
        The value of the ModelAccessConfig property for this object.
      • hashCode

        public final int hashCode()
        Overrides:
        hashCode in class Object
      • equals

        public final boolean equals(Object obj)
        Overrides:
        equals in class Object
      • toString

        public final String toString()
        Returns a string representation of this object. This is useful for testing and debugging. Sensitive data will be redacted from this string using a placeholder value.
        Overrides:
        toString in class Object
      • getValueForField

        public final <T> Optional<T> getValueForField(String fieldName, Class<T> clazz)
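
        A hedged usage sketch; the field-name string "BaseModelName" is an assumption based on the SDK convention of keying getValueForField on the model member name:

          Optional<String> baseModel = config.getValueForField("BaseModelName", String.class);
          baseModel.ifPresent(name -> System.out.println("Base model: " + name));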