.. DO NOT EDIT.
.. THIS FILE WAS AUTOMATICALLY GENERATED BY SPHINX-GALLERY.
.. TO MAKE CHANGES, EDIT THE SOURCE PYTHON FILE:
.. "auto_examples/strategy-comparison.py"
.. LINE NUMBERS ARE GIVEN BELOW.

.. only:: html

    .. note::
        :class: sphx-glr-download-link-note

        :ref:`Go to the end <sphx_glr_download_auto_examples_strategy-comparison.py>`
        to download the full example code or to run this example in your
        browser via Binder

.. rst-class:: sphx-glr-example-title

.. _sphx_glr_auto_examples_strategy-comparison.py:

==========================
Comparing surrogate models
==========================

Tim Head, July 2016.
Reformatted by Holger Nahrstaedt 2020

.. currentmodule:: skopt

Bayesian optimization or sequential model-based optimization uses a
surrogate model to approximate the expensive-to-evaluate function `func`.
There are several choices for what kind of surrogate model to use. This
notebook compares the performance of:

* Gaussian processes,
* extra trees, and
* random forests

as surrogate models. A purely random optimization strategy is also used as
a baseline.

.. GENERATED FROM PYTHON SOURCE LINES 23-32

.. code-block:: Python

    print(__doc__)
    import numpy as np

    np.random.seed(123)
    import matplotlib.pyplot as plt

    from skopt.benchmarks import branin as _branin

.. GENERATED FROM PYTHON SOURCE LINES 33-39

Toy model
=========

We will use the :class:`benchmarks.branin` function as a toy model for the
expensive function. In a real-world application this function would be
unknown and expensive to evaluate.

.. GENERATED FROM PYTHON SOURCE LINES 39-45

.. code-block:: Python

    def branin(x, noise_level=0.0):
        return _branin(x) + noise_level * np.random.randn()

.. GENERATED FROM PYTHON SOURCE LINES 46-79

.. code-block:: Python

    from matplotlib.colors import LogNorm


    def plot_branin():
        fig, ax = plt.subplots()

        x1_values = np.linspace(-5, 10, 100)
        x2_values = np.linspace(0, 15, 100)
        x_ax, y_ax = np.meshgrid(x1_values, x2_values)
        vals = np.c_[x_ax.ravel(), y_ax.ravel()]
        fx = np.reshape([branin(val) for val in vals], (100, 100))

        cm = ax.pcolormesh(
            x_ax, y_ax, fx,
            norm=LogNorm(vmin=fx.min(), vmax=fx.max()),
            cmap='viridis_r'
        )

        minima = np.array([[-np.pi, 12.275], [+np.pi, 2.275], [9.42478, 2.475]])
        ax.plot(minima[:, 0], minima[:, 1], "r.", markersize=14, lw=0, label="Minima")

        cb = fig.colorbar(cm)
        cb.set_label("f(x)")

        ax.legend(loc="best", numpoints=1)

        ax.set_xlabel("X1")
        ax.set_xlim([-5, 10])
        ax.set_ylabel("X2")
        ax.set_ylim([0, 15])


    plot_branin()

.. image-sg:: /auto_examples/images/sphx_glr_strategy-comparison_001.png
   :alt: strategy comparison
   :srcset: /auto_examples/images/sphx_glr_strategy-comparison_001.png
   :class: sphx-glr-single-img

.. GENERATED FROM PYTHON SOURCE LINES 80-95

This shows the value of the two-dimensional branin function and the
locations of its three minima.

Objective
=========

The objective of this example is to find one of these minima in as few
iterations as possible. One iteration is defined as one call to the
:class:`benchmarks.branin` function.

We will evaluate each model several times, using a different seed for the
random number generator each time, and then compare the average performance
of these models. This makes the comparison more robust against models that
get "lucky".

.. GENERATED FROM PYTHON SOURCE LINES 95-104

.. code-block:: Python

    from functools import partial

    from skopt import dummy_minimize, forest_minimize, gp_minimize

    func = partial(branin, noise_level=2.0)
    bounds = [(-5.0, 10.0), (0.0, 15.0)]
    n_calls = 60

.. GENERATED FROM PYTHON SOURCE LINES 105-125

.. code-block:: Python

    def run(minimizer, n_iter=5):
        return [
            minimizer(func, bounds, n_calls=n_calls, random_state=n)
            for n in range(n_iter)
        ]


    # Random search
    dummy_res = run(dummy_minimize)

    # Gaussian processes
    gp_res = run(gp_minimize)

    # Random forest
    rf_res = run(partial(forest_minimize, base_estimator="RF"))

    # Extra trees
    et_res = run(partial(forest_minimize, base_estimator="ET"))

.. GENERATED FROM PYTHON SOURCE LINES 126-127

Note that this can take a few minutes.

.. GENERATED FROM PYTHON SOURCE LINES 127-141

.. code-block:: Python

    from skopt.plots import plot_convergence

    plot = plot_convergence(
        ("dummy_minimize", dummy_res),
        ("gp_minimize", gp_res),
        ("forest_minimize('rf')", rf_res),
        ("forest_minimize('et')", et_res),
        true_minimum=0.397887,
        yscale="log",
    )

    plot.legend(loc="best", prop={'size': 6}, numpoints=1)

.. image-sg:: /auto_examples/images/sphx_glr_strategy-comparison_002.png
   :alt: Convergence plot
   :srcset: /auto_examples/images/sphx_glr_strategy-comparison_002.png
   :class: sphx-glr-single-img

.. GENERATED FROM PYTHON SOURCE LINES 142-155

This plot shows the value of the minimum found (y-axis) as a function of
the number of iterations performed so far (x-axis). The dashed red line
indicates the true value of the minimum of the :class:`benchmarks.branin`
function.

For the first ten iterations all methods perform equally well, as they all
start by evaluating ten random samples before fitting their respective
model for the first time. After iteration ten, the next point at which to
evaluate :class:`benchmarks.branin` is guided by the model, which is where
differences start to appear.

Each minimizer only has access to noisy observations of the objective
function, so as time passes (more iterations) it will start observing
values that are below the true minimum simply because they are noise
fluctuations.
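The curves drawn by :class:`plots.plot_convergence` can also be reproduced
by hand from the returned ``OptimizeResult`` objects, which makes for a
useful sanity check. Each result stores the observed (noisy) objective
values in its ``func_vals`` attribute, and the running minimum of that
array is the convergence trace of a single run. The following minimal
sketch, added here for illustration and not part of the generated example,
averages those traces over the five seeds and prints the final value for
each strategy.

.. code-block:: Python

    def mean_running_minimum(results):
        # `func_vals` holds the n_calls observed objective values of one
        # run; its running minimum is that run's convergence trace.
        # Averaging over runs gives one curve per strategy.
        traces = [np.minimum.accumulate(res.func_vals) for res in results]
        return np.mean(traces, axis=0)


    for name, results in [
        ("dummy_minimize", dummy_res),
        ("gp_minimize", gp_res),
        ("forest_minimize('rf')", rf_res),
        ("forest_minimize('et')", et_res),
    ]:
        trace = mean_running_minimum(results)
        print(f"{name:24} mean best value after {n_calls} calls: {trace[-1]:.4f}")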
.. rst-class:: sphx-glr-timing

**Total running time of the script:** (3 minutes 7.876 seconds)


.. _sphx_glr_download_auto_examples_strategy-comparison.py:

.. only:: html

  .. container:: sphx-glr-footer sphx-glr-footer-example

    .. container:: binder-badge

      .. image:: images/binder_badge_logo.svg
        :target: https://mybinder.org/v2/gh/holgern/scikit-optimize/master?urlpath=lab/tree/notebooks/auto_examples/strategy-comparison.ipynb
        :alt: Launch binder
        :width: 150 px

    .. container:: sphx-glr-download sphx-glr-download-jupyter

      :download:`Download Jupyter notebook: strategy-comparison.ipynb <strategy-comparison.ipynb>`

    .. container:: sphx-glr-download sphx-glr-download-python

      :download:`Download Python source code: strategy-comparison.py <strategy-comparison.py>`


.. only:: html

 .. rst-class:: sphx-glr-signature

    `Gallery generated by Sphinx-Gallery <https://sphinx-gallery.github.io>`_