01. Basic Tutorial
In this tutorial, you will learn how to:
- Define a search space
- Optimize an objective function
This tutorial shows how to optimize hyperparameters with HyperOpt without requiring a mathematical understanding of the algorithms it implements.
# Import HyperOpt Library
from hyperopt import tpe, hp, fmin
Next, declare an objective function to optimize. In this tutorial, we will optimize a simple quadratic function called objective:
$$ y = (x-3)^2 + 2 $$
objective = lambda x: (x-3)**2 + 2
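As a quick sanity check, evaluating the function at its vertex confirms the minimum implied by the formula above:

# The quadratic attains its minimum at x = 3, where y = 2.
print(objective(3))  # -> 2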
Now, let's visualize this objective function.
import matplotlib.pyplot as plt
import numpy as np
x = np.linspace(-10, 10, 100)
y = objective(x)
fig = plt.figure()
plt.plot(x, y)
plt.show()
We want to optimize the objective function by changing the hyperparameter $x$, so we need to declare a search space for $x$. The functions for defining search spaces are implemented in hyperopt.hp. The list is as follows (a short illustrative sketch follows the list):
- hp.randint(label, upper) or hp.randint(label, low, high)
- hp.uniform(label, low, high)
- hp.loguniform(label, low, high)
- hp.normal(label, mu, sigma)
- hp.lognormal(label, mu, sigma)
- hp.quniform(label, low, high, q)
- hp.qloguniform(label, low, high, q)
- hp.qnormal(label, mu, sigma, q)
- hp.qlognormal(label, mu, sigma, q)
- hp.choice(label, list)
- hp.pchoice(label, p_list), with p_list as a list of (probability, option) pairs
- hp.uniformint(label, low, high, q) or hp.uniformint(label, low, high), since q = 1.0
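To make these expressions concrete, here is a small sketch (not used in the rest of this tutorial) that combines a few of them into a dictionary-shaped space. The labels 'lr', 'batch_size', and 'activation' and their ranges are invented purely for illustration:

from hyperopt import hp

# A purely illustrative search space; labels and ranges are arbitrary examples.
example_space = {
    'lr': hp.loguniform('lr', -5, 0),                        # log-uniform between e^-5 and e^0
    'batch_size': hp.quniform('batch_size', 16, 128, 16),    # multiples of 16 in [16, 128]
    'activation': hp.choice('activation', ['relu', 'tanh'])  # one of the listed options
}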
We will use the most basic of these, hp.uniform, in this tutorial.
# Define the search space of x between -10 and 10.
space = hp.uniform('x', -10, 10)
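As an optional aside, you can sanity-check what a search space produces by drawing a few random samples from it with hyperopt.pyll.stochastic.sample:

from hyperopt.pyll.stochastic import sample

# Draw a few random values of x from the space to see what fmin will explore.
for _ in range(3):
    print(sample(space))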
Now only one step is left. So far, we have defined an objective function and a search space for $x$. We can now search that space for the value of $x$ that minimizes the objective function. HyperOpt does this with fmin.
best = fmin(
fn=objective, # Objective Function to optimize
space=space, # Hyperparameter's Search Space
algo=tpe.suggest, # Optimization algorithm
max_evals=1000 # Number of optimization attempts
)
print(best)
100%|██████████| 1000/1000 [00:04<00:00, 228.56trial/s, best loss: 2.000001036046408]
{'x': 3.0010178636491283}
The optimal $x$ value found by HyperOpt is approximately 3.0, which is very close to the true minimizer of $y=(x-3)^2+2$: the minimum is at $x = 3$, where $y = 2$, matching the reported best loss.
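If you also want to inspect the individual evaluations, fmin accepts a Trials object. The sketch below is an optional extension of the example above (max_evals is reduced to 100 just to keep it quick):

from hyperopt import Trials, fmin, tpe

trials = Trials()
best = fmin(
    fn=objective,      # Objective function to optimize
    space=space,       # Hyperparameter search space
    algo=tpe.suggest,  # Optimization algorithm
    max_evals=100,     # Number of optimization attempts
    trials=trials      # Records every evaluation
)

print(trials.best_trial['result']['loss'])  # Smallest objective value observed
print(best)                                 # Corresponding value of x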