sttp.openai.requests.finetuning.Hyperparameters
See the Hyperparameters companion object
case class Hyperparameters(batchSize: Option[Int], learningRateMultiplier: Option[Float], nEpochs: Option[Int], beta: Option[Float])
Value parameters
- batchSize: Number of examples in each batch. A larger batch size means that model parameters are updated less frequently, but with lower variance.
- beta: The beta value for the DPO method. A higher beta value will increase the weight of the penalty between the policy and reference model.
- learningRateMultiplier: Scaling factor for the learning rate. A smaller learning rate may be useful to avoid overfitting.
- nEpochs: The number of epochs to train the model for. An epoch refers to one full cycle through the training dataset.
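All four fields are optional, so a request can set only the hyperparameters it wants to override. A minimal construction sketch based on the signature above; the concrete values are illustrative only, and leaving a field as None is assumed to defer to the service's defaults:

```scala
import sttp.openai.requests.finetuning.Hyperparameters

// Illustrative values only; None fields are assumed to fall back to server-side defaults.
val hyperparameters = Hyperparameters(
  batchSize = Some(8),                 // examples per batch
  learningRateMultiplier = Some(0.1f), // scales the base learning rate
  nEpochs = Some(3),                   // full passes over the training dataset
  beta = None                          // only relevant for the DPO method
)
```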
Attributes
- Companion: object
- Supertypes: trait Serializable, trait Product, trait Equals, class Object, trait Matchable, class Any