Hyperparameters

Info

Hyperparameter optimization is a new feature available since version 0.6.0. In general, it is a challenging and computationally expensive topic, and only a few basics are presented in this guide. If you are interested in contributing or collaborating, please let us know, so we can make this module more robust and feature-rich.

Most algorithms have hyperparameters. For some optimization methods, the parameters are already defined as tunable variables and can be optimized directly. For instance, for Differential Evolution (DE), the parameters can be retrieved by:

[1]:
from pymoo.algorithms.soo.nonconvex.de import DE
from pymoo.core.parameters import flatten, get_params

# collect the tunable parameters of the algorithm as a flat dictionary
algorithm = DE()
flatten(get_params(algorithm))
[1]:
{'mating.jitter': <pymoo.core.variable.Choice at 0x7ff763f82b20>,
 'mating.CR': <pymoo.core.variable.Real at 0x7ff763f82a90>,
 'mating.crossover': <pymoo.core.variable.Choice at 0x7ff763d910d0>,
 'mating.F': <pymoo.core.variable.Real at 0x7ff763f82a30>,
 'mating.n_diffs': <pymoo.core.variable.Choice at 0x7ff763f829d0>,
 'mating.selection': <pymoo.core.variable.Choice at 0x7ff763f829a0>}
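
Each entry is a pymoo variable object that defines the domain of one parameter. Assuming, as in pymoo's variable implementation, that Real variables expose a bounds attribute and Choice variables an options attribute, the search space can be inspected with a small sketch like this:

[ ]:
from pymoo.algorithms.soo.nonconvex.de import DE
from pymoo.core.parameters import flatten, get_params
from pymoo.core.variable import Choice, Real

# print the name and domain of each tunable parameter
for name, var in flatten(get_params(DE())).items():
    if isinstance(var, Real):
        print(name, "real in", var.bounds)
    elif isinstance(var, Choice):
        print(name, "one of", var.options)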

If not provided explicitly when initializing a HyperparameterProblem, these variables are the ones used for optimization.
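
The same functions can also be used to set parameters by hand before running the algorithm. A minimal sketch using names from the dictionary printed above (the values are purely illustrative):

[ ]:
from pymoo.algorithms.soo.nonconvex.de import DE
from pymoo.core.parameters import hierarchical, set_params

algorithm = DE()

# convert the flat dotted names into the nested structure and assign the values
set_params(algorithm, hierarchical({"mating.F": 0.7, "mating.CR": 0.9}))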

Next, one needs to define what exactly should be optimized. For instance, for a single run on a problem (with a fixed random seed) using the well-known hyperparameter optimization toolkit Optuna, the implementation may look as follows:

[2]:
from pymoo.algorithms.hyperparameters import SingleObjectiveSingleRun, HyperparameterProblem
from pymoo.algorithms.soo.nonconvex.g3pcx import G3PCX
from pymoo.algorithms.soo.nonconvex.optuna import Optuna
from pymoo.core.parameters import set_params, hierarchical
from pymoo.optimize import minimize
from pymoo.problems.single import Sphere

# the algorithm whose hyperparameters shall be optimized
algorithm = G3PCX()

problem = Sphere(n_var=10)
n_evals = 500

# performance assessment: a single run on the problem with a fixed random seed
performance = SingleObjectiveSingleRun(problem, termination=("n_evals", n_evals), seed=1)

# let Optuna search the hyperparameter space
res = minimize(HyperparameterProblem(algorithm, performance),
               Optuna(),
               termination=('n_evals', 50),
               seed=1,
               verbose=False)

# assign the best hyperparameters found and rerun the algorithm with them
hyperparams = res.X
print(hyperparams)
set_params(algorithm, hierarchical(hyperparams))

res = minimize(Sphere(), algorithm, termination=("n_evals", n_evals), seed=1)
print("Best solution found: \nX = %s\nF = %s" % (res.X, res.F))
{'mutation.eta': 24.211482075883815, 'mutation.prob': 0.564451333911765, 'crossover.zeta': 0.04900418643742088, 'crossover.eta': 0.0959206113008407, 'family_size': 5, 'n_parents': 4, 'n_offsprings': 1, 'pop_size': 82}
Best solution found:
X = [0.50001432 0.5001597  0.49995753 0.5001489  0.49988024 0.49995418
 0.50008367 0.49996053 0.49999811 0.500049  ]
F = [7.70894799e-08]
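
Note that res.X is a flat dictionary with dotted keys, whereas set_params expects the nested structure of the algorithm; hierarchical performs this conversion. A minimal illustration (the printed result is an assumption based on the dotted names):

[ ]:
from pymoo.core.parameters import hierarchical

flat = {"mutation.eta": 24.2, "mutation.prob": 0.56, "pop_size": 82}
print(hierarchical(flat))
# assumed output: {'mutation': {'eta': 24.2, 'prob': 0.56}, 'pop_size': 82}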

Of course, you can also directly use the MixedVariableGA available in our framework:

[3]:
from pymoo.algorithms.hyperparameters import SingleObjectiveSingleRun, HyperparameterProblem
from pymoo.algorithms.soo.nonconvex.g3pcx import G3PCX
from pymoo.core.mixed import MixedVariableGA
from pymoo.core.parameters import set_params, hierarchical
from pymoo.optimize import minimize
from pymoo.problems.single import Sphere


algorithm = G3PCX()

problem = Sphere(n_var=10)
n_evals = 500

performance = SingleObjectiveSingleRun(problem, termination=("n_evals", n_evals), seed=1)

# tune the hyperparameters with the mixed-variable GA instead of Optuna
res = minimize(HyperparameterProblem(algorithm, performance),
               MixedVariableGA(pop_size=5),
               termination=('n_evals', 50),
               seed=1,
               verbose=False)

hyperparams = res.X
print(hyperparams)
set_params(algorithm, hierarchical(hyperparams))

res = minimize(Sphere(), algorithm, termination=("n_evals", n_evals), seed=1)
print("Best solution found: \nX = %s\nF = %s" % (res.X, res.F))
{'mutation.eta': 22.448761322938267, 'mutation.prob': 0.1862602113776709, 'crossover.zeta': 0.20835233371712414, 'crossover.eta': 0.17156692156188078, 'family_size': 10, 'n_parents': 4, 'n_offsprings': 3, 'pop_size': 55}
Best solution found:
X = [0.50005652 0.49996977 0.50002159 0.49994603 0.50000987 0.50005497
 0.49989805 0.50001285 0.49998918 0.50000379]
F = [2.12975516e-08]

Optimizing the parameters for a single random seed is often not desirable, because the resulting configuration may not generalize to other seeds; this need for repeated runs is precisely what makes hyperparameter optimization computationally expensive. So instead of using just a single random seed, we can use the MultiRun performance assessment to average over multiple runs as follows:

[4]:
from pymoo.algorithms.hyperparameters import HyperparameterProblem, MultiRun, stats_single_objective_mean
from pymoo.algorithms.soo.nonconvex.g3pcx import G3PCX
from pymoo.core.mixed import MixedVariableGA
from pymoo.core.parameters import set_params, hierarchical
from pymoo.optimize import minimize
from pymoo.problems.single import Sphere


algorithm = G3PCX()

problem = Sphere(n_var=10)
n_evals = 500
seeds = [5, 50, 500]

# average the best objective value over one run per random seed
performance = MultiRun(problem, seeds=seeds, func_stats=stats_single_objective_mean, termination=("n_evals", n_evals))

res = minimize(HyperparameterProblem(algorithm, performance),
               MixedVariableGA(pop_size=5),
               termination=('n_evals', 50),
               seed=1,
               verbose=True)

hyperparams = res.X
print(hyperparams)
set_params(algorithm, hierarchical(hyperparams))

res = minimize(Sphere(), algorithm, termination=("n_evals", n_evals), seed=5)
print("Best solution found: \nX = %s\nF = %s" % (res.X, res.F))

=================================================
n_gen  |  n_eval  |     f_avg     |     f_min
=================================================
     1 |        5 |  0.0025603953 |  0.0000961170
     2 |       10 |  0.0002893318 |  0.0000961170
     3 |       15 |  0.0000951445 |  0.0000315862
     4 |       20 |  0.0000511226 |  4.818149E-06
     5 |       25 |  0.0000118047 |  2.104538E-06
     6 |       30 |  2.625696E-06 |  1.629535E-06
     7 |       35 |  1.937361E-06 |  1.376478E-06
     8 |       40 |  1.234679E-06 |  1.523107E-07
     9 |       45 |  1.085508E-06 |  1.523107E-07
    10 |       50 |  8.631620E-07 |  1.523107E-07
{'mutation.eta': 22.448761322938267, 'mutation.prob': 0.2184784888033633, 'crossover.zeta': 0.21330354636525817, 'crossover.eta': 0.12139887634192673, 'family_size': 7, 'n_parents': 4, 'n_offsprings': 4, 'pop_size': 68}
Best solution found:
X = [0.49997706 0.49993435 0.49997831 0.49998572 0.50008528 0.50003351
 0.49983664 0.49982966 0.4999344  0.50019165]
F = [1.10642165e-07]
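
The aggregation is controlled by func_stats and does not have to be the mean. As a sketch, assuming func_stats receives the list of results of the individual runs and returns a dictionary whose "F" entry serves as the performance value, a median-based statistic could look as follows (stats_single_objective_median is a hypothetical helper, not part of pymoo):

[ ]:
import numpy as np

# hypothetical aggregation: median of the best objective values across the seeds
def stats_single_objective_median(runs):
    f = np.array([run.F.min() for run in runs])
    return dict(F=np.median(f))

performance = MultiRun(problem, seeds=seeds, func_stats=stats_single_objective_median,
                       termination=("n_evals", n_evals))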

Another performance measure is the number of evaluations needed until a specific goal is reached. For single-objective optimization, such a goal is usually a minimum function value to be found. Thus, for the termination, we use MinimumFunctionValueTermination with a value of 1e-5. We run the method for each random seed until this value has been reached or at most 500 function evaluations have taken place. The performance is then measured by the average number of function evaluations (func_stats=stats_avg_nevals) needed to reach the goal.

[5]:
from pymoo.algorithms.hyperparameters import HyperparameterProblem, MultiRun, stats_avg_nevals
from pymoo.algorithms.soo.nonconvex.g3pcx import G3PCX
from pymoo.core.mixed import MixedVariableGA
from pymoo.core.parameters import set_params, hierarchical
from pymoo.core.termination import TerminateIfAny
from pymoo.optimize import minimize
from pymoo.problems.single import Sphere
from pymoo.termination.fmin import MinimumFunctionValueTermination
from pymoo.termination.max_eval import MaximumFunctionCallTermination

algorithm = G3PCX()

problem = Sphere(n_var=10)

# stop a run once f <= 1e-5 has been reached or after at most 500 evaluations
termination = TerminateIfAny(MinimumFunctionValueTermination(1e-5), MaximumFunctionCallTermination(500))

# performance: average number of function evaluations needed to reach the goal
performance = MultiRun(problem, seeds=[5, 50, 500], func_stats=stats_avg_nevals, termination=termination)

res = minimize(HyperparameterProblem(algorithm, performance),
               MixedVariableGA(pop_size=5),
               termination=('n_evals', 50),
               seed=1,
               verbose=True)

hyperparams = res.X
print(hyperparams)
set_params(algorithm, hierarchical(hyperparams))

# rerun with the tuned parameters, using the average evaluation count found (res.f) as the budget
res = minimize(Sphere(), algorithm, termination=("n_evals", res.f), seed=5)
print("Best solution found: \nX = %s\nF = %s" % (res.X, res.F))

=================================================
n_gen  |  n_eval  |     f_avg     |     f_min
=================================================
     1 |        5 |  5.298000E+02 |  5.030000E+02
     2 |       10 |  5.050000E+02 |  5.030000E+02
     3 |       15 |  5.034000E+02 |  5.010000E+02
     4 |       20 |  5.022000E+02 |  5.010000E+02
     5 |       25 |  5.010000E+02 |  5.010000E+02
     6 |       30 |  5.010000E+02 |  5.010000E+02
     7 |       35 |  5.010000E+02 |  5.010000E+02
     8 |       40 |  5.010000E+02 |  5.010000E+02
     9 |       45 |  5.010000E+02 |  5.010000E+02
    10 |       50 |  5.010000E+02 |  5.010000E+02
{'mutation.eta': 22.47869760772081, 'mutation.prob': 0.1862602113776709, 'crossover.zeta': 0.21468140012634795, 'crossover.eta': 0.1310183926864668, 'family_size': 10, 'n_parents': 3, 'n_offsprings': 4, 'pop_size': 21}
Best solution found:
X = [0.57274658 0.50314574 0.51366084 0.54297554 0.45386054 0.51049599
 0.4847838  0.50450093 0.50099743 0.47083067]
F = [0.01067813]