.. DO NOT EDIT.
.. THIS FILE WAS AUTOMATICALLY GENERATED BY SPHINX-GALLERY.
.. TO MAKE CHANGES, EDIT THE SOURCE PYTHON FILE:
.. "auto_examples\optimizer-with-different-base-estimator.py"
.. LINE NUMBERS ARE GIVEN BELOW.

.. only:: html

    .. note::
        :class: sphx-glr-download-link-note

        :ref:`Go to the end ` to download the full example code or to run this example in your browser via Binder

.. rst-class:: sphx-glr-example-title

.. _sphx_glr_auto_examples_optimizer-with-different-base-estimator.py:

==============================================
Use different base estimators for optimization
==============================================

Sigurd Carlen, September 2019.
Reformatted by Holger Nahrstaedt 2020

.. currentmodule:: skopt

To use a different ``base_estimator``, or to create a regressor with
non-default parameters, we can construct a regressor object and pass it
as ``base_estimator``. This example uses
:class:`plots.plot_gaussian_process`, which is available since version 0.8.

.. GENERATED FROM PYTHON SOURCE LINES 18-29

.. code-block:: Python

    print(__doc__)

    import numpy as np

    np.random.seed(1234)
    import matplotlib.pyplot as plt

    from skopt import Optimizer
    from skopt.plots import plot_gaussian_process

.. GENERATED FROM PYTHON SOURCE LINES 30-34

Toy example
-----------

Let us assume the following noisy function :math:`f`:

.. GENERATED FROM PYTHON SOURCE LINES 34-49

.. code-block:: Python

    noise_level = 0.1

    # Our 1D toy problem: the function we are trying to minimize
    def objective(x, noise_level=noise_level):
        return (
            np.sin(5 * x[0]) * (1 - np.tanh(x[0] ** 2))
            + np.random.randn() * noise_level
        )

    def objective_wo_noise(x):
        return objective(x, noise_level=0)

.. GENERATED FROM PYTHON SOURCE LINES 50-59

.. code-block:: Python

    opt_gp = Optimizer(
        [(-2.0, 2.0)],
        base_estimator="GP",
        n_initial_points=5,
        acq_optimizer="sampling",
        random_state=42,
    )

.. GENERATED FROM PYTHON SOURCE LINES 60-102
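Before running the optimizer, it is worth knowing what it should converge to. A dense grid evaluation of the noiseless objective locates the minimum near :math:`x \approx -0.3`. This is a NumPy-only sketch, not part of the generated example; the grid resolution is an arbitrary choice of ours:

```python
import numpy as np

def objective_wo_noise(x):
    # Noiseless version of the toy objective from the example,
    # vectorized over x directly (the example indexes x[0] instead)
    return np.sin(5 * x) * (1 - np.tanh(x ** 2))

# Evaluate on a dense grid over the search interval [-2, 2]
xs = np.linspace(-2.0, 2.0, 4001)
ys = objective_wo_noise(xs)
best_x = xs[np.argmin(ys)]
best_f = ys.min()
print(f"grid minimum: f({best_x:.4f}) = {best_f:.4f}")
```

A grid scan needs thousands of evaluations; the point of the Bayesian optimizer below is to approach the same minimum with only a handful of (noisy) evaluations.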
.. code-block:: Python

    def plot_optimizer(res, n_iter, max_iters=5):
        if n_iter == 0:
            show_legend = True
        else:
            show_legend = False
        ax = plt.subplot(max_iters, 2, 2 * n_iter + 1)
        # Plot GP(x) + contours
        ax = plot_gaussian_process(
            res,
            ax=ax,
            objective=objective_wo_noise,
            noise_level=noise_level,
            show_legend=show_legend,
            show_title=True,
            show_next_point=False,
            show_acq_func=False,
        )
        ax.set_ylabel("")
        ax.set_xlabel("")
        if n_iter < max_iters - 1:
            ax.get_xaxis().set_ticklabels([])
        # Plot EI(x)
        ax = plt.subplot(max_iters, 2, 2 * n_iter + 2)
        ax = plot_gaussian_process(
            res,
            ax=ax,
            noise_level=noise_level,
            show_legend=show_legend,
            show_title=False,
            show_next_point=True,
            show_acq_func=True,
            show_observations=False,
            show_mu=False,
        )
        ax.set_ylabel("")
        ax.set_xlabel("")
        if n_iter < max_iters - 1:
            ax.get_xaxis().set_ticklabels([])

.. GENERATED FROM PYTHON SOURCE LINES 103-105

GP kernel
---------

.. GENERATED FROM PYTHON SOURCE LINES 105-117

.. code-block:: Python

    fig = plt.figure()
    fig.suptitle("Standard GP kernel")
    for i in range(10):
        next_x = opt_gp.ask()
        f_val = objective(next_x)
        res = opt_gp.tell(next_x, f_val)
        if i >= 5:
            plot_optimizer(res, n_iter=i - 5, max_iters=5)
    plt.tight_layout(rect=[0, 0.03, 1, 0.95])
    plt.plot()

.. image-sg:: /auto_examples/images/sphx_glr_optimizer-with-different-base-estimator_001.png
   :alt: Standard GP kernel, x* = -0.2167, f(x*) = -0.9141, x* = -0.2167, f(x*) = -0.9141, x* = -0.2167, f(x*) = -0.9141, x* = -0.2167, f(x*) = -0.9141, x* = -0.2167, f(x*) = -0.9141
   :srcset: /auto_examples/images/sphx_glr_optimizer-with-different-base-estimator_001.png
   :class: sphx-glr-single-img

.. rst-class:: sphx-glr-script-out

.. code-block:: none

    []

.. GENERATED FROM PYTHON SOURCE LINES 118-120

Test different kernels
----------------------

.. GENERATED FROM PYTHON SOURCE LINES 120-150
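The right-hand column of each figure plots the expected-improvement acquisition function EI(x), which the optimizer maximizes to choose the next evaluation point. As a minimal illustration of that quantity, here is a NumPy-only sketch of EI for minimization; the helper names are ours and not part of skopt:

```python
import numpy as np
from math import erf, pi, sqrt

def _norm_pdf(z):
    # Standard normal density
    return np.exp(-0.5 * z ** 2) / sqrt(2 * pi)

def _norm_cdf(z):
    # Standard normal CDF via the error function
    return 0.5 * (1.0 + np.vectorize(erf)(z / sqrt(2.0)))

def expected_improvement(mu, sigma, y_best):
    """EI for minimization: E[max(y_best - Y, 0)] with Y ~ N(mu, sigma^2)."""
    mu = np.asarray(mu, dtype=float)
    sigma = np.asarray(sigma, dtype=float)
    z = (y_best - mu) / sigma
    return (y_best - mu) * _norm_cdf(z) + sigma * _norm_pdf(z)

# Where the surrogate mean equals the incumbent best (mu == y_best),
# EI reduces to sigma * pdf(0) = sigma / sqrt(2 * pi)
ei = expected_improvement(mu=[0.0], sigma=[1.0], y_best=0.0)
print(ei)
```

EI grows where the posterior mean is low (exploitation) or the posterior spread is high (exploration), which is why the acquisition panels peak away from already-sampled points.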
.. code-block:: Python

    from sklearn.gaussian_process.kernels import (
        RBF,
        ConstantKernel,
        DotProduct,
        ExpSineSquared,
        Matern,
        RationalQuadratic,
    )

    from skopt.learning import GaussianProcessRegressor
    from skopt.learning.gaussian_process.kernels import ConstantKernel, Matern

    # Kernels to compare for the Gaussian process surrogate model
    kernels = [
        (1.0 * RBF(length_scale=1.0, length_scale_bounds=(1e-1, 10.0)), "RBF"),
        (1.0 * RationalQuadratic(length_scale=1.0, alpha=0.1), "RationalQuadratic"),
        (
            1.0
            * ExpSineSquared(
                length_scale=1.0,
                periodicity=3.0,
                length_scale_bounds=(0.1, 10.0),
                periodicity_bounds=(1.0, 10.0),
            ),
            "ExpSineSquared",
        ),
        # (ConstantKernel(0.1, (0.01, 10.0))
        #  * (DotProduct(sigma_0=1.0, sigma_0_bounds=(0.1, 10.0)) ** 2), "ConstantKernel"),
        (1.0 * Matern(length_scale=1.0, length_scale_bounds=(1e-1, 10.0), nu=2.5), "Matern"),
    ]

.. GENERATED FROM PYTHON SOURCE LINES 151-177

.. code-block:: Python

    for kernel, label in kernels:
        gpr = GaussianProcessRegressor(
            kernel=kernel,
            alpha=noise_level**2,
            normalize_y=True,
            noise="gaussian",
            n_restarts_optimizer=2,
        )
        opt = Optimizer(
            [(-2.0, 2.0)],
            base_estimator=gpr,
            n_initial_points=5,
            acq_optimizer="sampling",
            random_state=42,
        )
        fig = plt.figure()
        fig.suptitle(label)
        for i in range(10):
            next_x = opt.ask()
            f_val = objective(next_x)
            res = opt.tell(next_x, f_val)
            if i >= 5:
                plot_optimizer(res, n_iter=i - 5, max_iters=5)
        plt.tight_layout(rect=[0, 0.03, 1, 0.95])
        plt.show()

.. rst-class:: sphx-glr-horizontal


    *

      .. image-sg:: /auto_examples/images/sphx_glr_optimizer-with-different-base-estimator_002.png
         :alt: RBF, x* = -0.5018, f(x*) = -0.4236, x* = -0.5018, f(x*) = -0.4236, x* = -0.5018, f(x*) = -0.4236, x* = -0.5018, f(x*) = -0.4236, x* = -0.5018, f(x*) = -0.4236
         :srcset: /auto_examples/images/sphx_glr_optimizer-with-different-base-estimator_002.png
         :class: sphx-glr-multi-img
    *

      .. image-sg:: /auto_examples/images/sphx_glr_optimizer-with-different-base-estimator_003.png
         :alt: RationalQuadratic, x* = -0.5018, f(x*) = -0.4792, x* = -0.5018, f(x*) = -0.4792, x* = -0.5018, f(x*) = -0.4792, x* = -0.5018, f(x*) = -0.4792, x* = -0.3767, f(x*) = -0.8734
         :srcset: /auto_examples/images/sphx_glr_optimizer-with-different-base-estimator_003.png
         :class: sphx-glr-multi-img

    *

      .. image-sg:: /auto_examples/images/sphx_glr_optimizer-with-different-base-estimator_004.png
         :alt: ExpSineSquared, x* = -0.5018, f(x*) = -0.4078, x* = -0.5018, f(x*) = -0.4078, x* = -0.5018, f(x*) = -0.4078, x* = -0.2591, f(x*) = -1.0230, x* = -0.2591, f(x*) = -1.0230
         :srcset: /auto_examples/images/sphx_glr_optimizer-with-different-base-estimator_004.png
         :class: sphx-glr-multi-img

    *

      .. image-sg:: /auto_examples/images/sphx_glr_optimizer-with-different-base-estimator_005.png
         :alt: Matern, x* = -0.5018, f(x*) = -0.5936, x* = -0.5018, f(x*) = -0.5936, x* = -0.5018, f(x*) = -0.5936, x* = -0.5018, f(x*) = -0.5936, x* = -0.5018, f(x*) = -0.5936
         :srcset: /auto_examples/images/sphx_glr_optimizer-with-different-base-estimator_005.png
         :class: sphx-glr-multi-img

.. rst-class:: sphx-glr-timing

**Total running time of the script:** (0 minutes 13.506 seconds)

.. _sphx_glr_download_auto_examples_optimizer-with-different-base-estimator.py:

.. only:: html

  .. container:: sphx-glr-footer sphx-glr-footer-example

    .. container:: binder-badge

      .. image:: images/binder_badge_logo.svg
        :target: https://mybinder.org/v2/gh/holgern/scikit-optimize/master?urlpath=lab/tree/notebooks/auto_examples/optimizer-with-different-base-estimator.ipynb
        :alt: Launch binder
        :width: 150 px

    .. container:: sphx-glr-download sphx-glr-download-jupyter

      :download:`Download Jupyter notebook: optimizer-with-different-base-estimator.ipynb `

    .. container:: sphx-glr-download sphx-glr-download-python

      :download:`Download Python source code: optimizer-with-different-base-estimator.py `
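The surrogates compared above differ mainly in how quickly they let correlation decay with distance (ExpSineSquared additionally encodes periodic structure the others lack). A quick way to see the difference outside any optimization loop is to evaluate the scikit-learn kernels directly; this snippet is an illustrative aside of ours, not part of the generated example:

```python
import numpy as np
from sklearn.gaussian_process.kernels import RBF, Matern, RationalQuadratic

# Correlation between x = 0 and points at increasing distance, per kernel
X = np.array([[0.0]])
Y = np.array([[0.0], [0.5], [1.0], [2.0]])

kernel_examples = {
    "RBF": RBF(length_scale=1.0),
    "Matern(nu=2.5)": Matern(length_scale=1.0, nu=2.5),
    "RationalQuadratic": RationalQuadratic(length_scale=1.0, alpha=0.1),
}
corr = {name: k(X, Y).ravel() for name, k in kernel_examples.items()}
for name, values in corr.items():
    print(f"{name:>18}: {np.round(values, 3)}")
```

All three equal 1 at distance zero and decay monotonically; the RationalQuadratic with a small ``alpha`` keeps much heavier tails than the RBF, while the Matern with ``nu=2.5`` behaves like a slightly rougher RBF.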
.. only:: html

 .. rst-class:: sphx-glr-signature

    `Gallery generated by Sphinx-Gallery `_