Creates an optimizer given the problem description.
Models a parameter having a limited set of values.
Models a simple real-valued parameter from the range [lower, upper].
Advanced sampler that models the efficiency function as a family of Gaussian processes (with integrated kernel parameters) and samples from it, trying to maximize Expected Improvement.
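To illustrate the acquisition criterion, here is a minimal sketch of the Expected Improvement (EI) formula for a Gaussian posterior prediction at a candidate point. The function names are hypothetical, not the library's API; the normal CDF uses the Abramowitz and Stegun polynomial approximation.

```scala
// Standard normal PDF.
def normalPdf(x: Double): Double =
  math.exp(-0.5 * x * x) / math.sqrt(2.0 * math.Pi)

// Standard normal CDF via the Abramowitz & Stegun polynomial approximation.
def normalCdf(x: Double): Double = {
  val t = 1.0 / (1.0 + 0.2316419 * math.abs(x))
  val poly = t * (0.319381530 + t * (-0.356563782 + t * (1.781477937 +
    t * (-1.821255978 + t * 1.330274429))))
  val tail = normalPdf(x) * poly
  if (x >= 0) 1.0 - tail else tail
}

// EI for maximization: the expected gain over the best value seen so far,
// given that the objective at the candidate is distributed N(mean, stdDev^2).
def expectedImprovement(mean: Double, stdDev: Double, best: Double): Double =
  if (stdDev <= 0) math.max(mean - best, 0.0)
  else {
    val z = (mean - best) / stdDev
    (mean - best) * normalCdf(z) + stdDev * normalPdf(z)
  }
```

Note that EI is always non-negative and grows with predictive uncertainty, which is what pushes the sampler to explore poorly covered regions.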
Provides the ability to search through multiple configurations in parallel mode, collecting all the stats and metrics.
Supports persisting temporary models in order to restore them after failures, but only when used with StableOrderParamGridBuilder.
Utility used to perform stepwise search for hyper-parameters. Useful when there are certain groups of parameters which do not influence each other and can be optimized separately. Pass a sequence of optimizers to the grouped search to apply sequential optimization.
NB: the grouped search itself must be configured to use a single thread, but nested optimizers are allowed to use as many threads as they need.
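The grouped stepwise idea can be sketched as follows: each group of mutually independent parameters is optimized in turn (here by an exhaustive inner search over candidate values) while the other parameters stay fixed. All names here are illustrative, not the library's API.

```scala
// Optimize one group of parameters exhaustively, others held fixed.
def optimizeGroup(
    objective: Map[String, Double] => Double,
    current: Map[String, Double],
    group: Seq[String],
    candidates: Seq[Double]): Map[String, Double] = {
  // Enumerate every candidate combination for this group's parameters.
  val combos = group.foldLeft(Seq(Map.empty[String, Double])) { (acc, name) =>
    for (m <- acc; v <- candidates) yield m + (name -> v)
  }
  // Parameters outside the group keep their current values.
  combos.map(c => current ++ c).maxBy(objective)
}

// Apply the inner searches sequentially, one group after another.
def groupedSearch(
    objective: Map[String, Double] => Double,
    start: Map[String, Double],
    groups: Seq[Seq[String]],
    candidates: Seq[Double]): Map[String, Double] =
  groups.foldLeft(start) { (params, g) =>
    optimizeGroup(objective, params, g, candidates)
  }
```

For a separable objective this finds each group's optimum independently, which is exactly the situation the grouped search is designed for.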
Common summary block used to store the history of the hyperparameter search.
Common trait for all hyper-parameter optimizers.
Models an ordinal-valued parameter from the sequence {lower, lower + 1, ..., upper}.
The parameter domain is used to map the [0,1] value sampled by the optimizer to the actual parameter value.
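A minimal sketch of how such domains might map a unit-interval sample to continuous, ordinal, and categorical parameter values. The trait and class names are hypothetical; the library's actual types may differ.

```scala
// Maps a sample u in [0,1] to an actual parameter value.
trait ParamDomain[T] {
  def fromUnit(u: Double): T
}

// Continuous parameter from the range [lower, upper].
final case class DoubleDomain(lower: Double, upper: Double)
    extends ParamDomain[Double] {
  def fromUnit(u: Double): Double = lower + u * (upper - lower)
}

// Ordinal parameter from the sequence {lower, lower + 1, ..., upper}.
final case class IntDomain(lower: Int, upper: Int) extends ParamDomain[Int] {
  def fromUnit(u: Double): Int = {
    val n = upper - lower + 1
    lower + math.min((u * n).toInt, n - 1) // clamp so u = 1.0 maps to upper
  }
}

// Parameter with a limited set of values.
final case class CategoricalDomain[T](values: Seq[T]) extends ParamDomain[T] {
  def fromUnit(u: Double): T =
    values(math.min((u * values.size).toInt, values.size - 1))
}
```

With this decoupling the optimizer only ever works in the [0,1]^N hypercube, while the domains translate its samples into concrete estimator settings.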
Holds the actual SparkML param and its domain. Supports type-safe methods for moving data between the optimizer, the data frame, and the SparkML estimator.
Simple sampler drawing from the Sobol point sequence.
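For intuition, the first dimension of a Sobol sequence coincides with the van der Corput sequence in base 2 (bit-reversed binary fractions); a production Sobol sampler additionally uses direction numbers for every dimension. This 1-D sketch only shows the low-discrepancy idea and is not the library's implementation.

```scala
// Van der Corput sequence in base 2: reverse the bits of the index
// around the binary point, producing evenly spread points in [0,1).
def vanDerCorput(index: Int): Double = {
  var i = index
  var result = 0.0
  var f = 0.5
  while (i > 0) {
    if ((i & 1) == 1) result += f // take the lowest bit as the next fraction
    i >>= 1
    f /= 2
  }
  result
}
```

The first points are 0.5, 0.25, 0.75, 0.125, ...: each new point falls into the largest remaining gap, which is why such sequences cover the search space more evenly than uniform random sampling.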
Builder for a param grid used in grid search-based model selection. This builder provides a stable order and thus can be reliably used with the persist-temp-models feature.
Searches for optimal parameters using a Bayesian approach. An important difference of this searcher compared to other forked estimators is that it needs previous evaluations in order to know where to sample the next params.
Used to perform hyperparameter search via sampling. Sampling is performed from the [0,1]^N hypercube, and the mapping from [0,1] to particular values is done by the parameter domains. The available implementations of Bayesian optimizers are largely based on the LinkedIn Photon-ML project.