Constrained Bayesian Optimization¶
In this tutorial we demonstrate the use of Xopt to perform Bayesian Optimization on a simple test problem subject to a single constraint.
Define the test problem¶
Here we define a simple optimization problem, where we attempt to minimize the sine function on the domain [0, 2π], subject to a cosine constraining function.
# Ignore all warnings
import warnings
warnings.filterwarnings("ignore")
import time
import math
from xopt.vocs import VOCS
# define variables, function objective and constraining function
vocs = VOCS(
    variables={"x": [0, 2 * math.pi]},
    objectives={"f": "MINIMIZE"},
    constraints={"c": ["LESS_THAN", 0]},
)
# define a test function to optimize
import numpy as np
def test_function(input_dict):
    return {"f": np.sin(input_dict["x"]), "c": np.cos(input_dict["x"])}
Create Xopt objects¶
Create the evaluator to evaluate our test function and create a generator that uses the Expected Improvement acquisition function to perform Bayesian Optimization.
from xopt.evaluator import Evaluator
from xopt.generators.bayesian import ExpectedImprovementGenerator
from xopt import Xopt
evaluator = Evaluator(function=test_function)
generator = ExpectedImprovementGenerator(vocs=vocs)
X = Xopt(evaluator=evaluator, generator=generator, vocs=vocs)
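For reference, the expected improvement acquisition function that this generator maximizes has a simple closed form under a Gaussian posterior. The sketch below is a generic textbook version for minimization, not Xopt's internal implementation; mu and sigma stand for the posterior mean and standard deviation at a candidate point, and f_best for the best objective value observed so far:

```python
import math

def norm_pdf(z):
    # standard normal probability density
    return math.exp(-0.5 * z * z) / math.sqrt(2 * math.pi)

def norm_cdf(z):
    # standard normal cumulative distribution
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2)))

def expected_improvement(mu, sigma, f_best):
    # Closed-form EI for *minimization* under a Gaussian posterior N(mu, sigma^2)
    if sigma <= 0:
        return max(f_best - mu, 0.0)
    z = (f_best - mu) / sigma
    return (f_best - mu) * norm_cdf(z) + sigma * norm_pdf(z)

print(expected_improvement(0.0, 1.0, 0.0))  # ≈ 0.3989 (pure exploration term)
```

Note how EI grows with the posterior uncertainty sigma, which is what drives the generator to balance exploration against exploitation.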
Generate and evaluate initial points¶
To begin optimization, we must generate some random initial data points. The first call to X.step() will generate and evaluate a number of random points, as specified by the generator. Note that if we add data to Xopt before calling X.step() (by assigning the data to X.data), calls to X.step() will skip the random generation and proceed directly to generating points via Bayesian optimization.
# call X.random_evaluate(n_samples) to generate + evaluate initial points
X.random_evaluate(n_samples=2)
# inspect the gathered data
X.data
| | x | f | c | xopt_runtime | xopt_error |
|---|---|---|---|---|---|
| 0 | 0.066292 | 0.066243 | 0.997804 | 0.000006 | False |
| 1 | 3.205246 | -0.063610 | -0.997975 | 0.000002 | False |
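The random initialization amounts to drawing inputs uniformly from the variable bounds and evaluating them; a minimal numpy equivalent of that sampling step (assuming uniform sampling within the bounds, which is the usual default) might look like:

```python
import numpy as np

rng = np.random.default_rng(seed=0)
lo, hi = 0.0, 2 * np.pi   # the "x" bounds from the VOCS above
n_samples = 2

x0 = rng.uniform(lo, hi, n_samples)                     # random inputs
results = {"x": x0, "f": np.sin(x0), "c": np.cos(x0)}   # evaluated samples
print(results["x"])
```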
Do Bayesian optimization steps¶
To perform optimization we simply call X.step()
in a loop. This allows us to do
intermediate tasks in between optimization steps, such as examining the model and
acquisition function at each step (as we demonstrate here).
import time
n_steps = 5
# test points for plotting
test_x = np.linspace(*X.vocs.bounds.flatten(), 50)
for i in range(n_steps):
    start = time.perf_counter()
    model = X.generator.train_model()
    fig, ax = X.generator.visualize_model(n_grid=100)
    print(time.perf_counter() - start)

    # add ground truth functions to plots
    out = test_function({"x": test_x})
    ax[0].plot(test_x, out["f"], "C0-.")
    ax[1].plot(test_x, out["c"], "C2-.")

    # do the optimization step
    X.step()
0.5273497909947764
0.14978720899671316
0.15450850001070648
0.15901533400756307
0.16169404098764062
# access the collected data
X.data
| | x | f | c | xopt_runtime | xopt_error |
|---|---|---|---|---|---|
| 0 | 0.066292 | 0.066243 | 0.997804 | 0.000006 | False |
| 1 | 3.205246 | -0.063610 | -0.997975 | 0.000002 | False |
| 2 | 4.544182 | -0.985887 | -0.167415 | 0.000006 | False |
| 3 | 5.992520 | -0.286590 | 0.958053 | 0.000005 | False |
| 4 | 4.663187 | -0.998790 | -0.049182 | 0.000007 | False |
| 5 | 4.693153 | -0.999815 | -0.019235 | 0.000006 | False |
| 6 | 4.699189 | -0.999913 | -0.013200 | 0.000009 | False |
6 | 4.699189 | -0.999913 | -0.013200 | 0.000009 | False |
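The samples converge toward x ≈ 4.70, consistent with the analytic solution: the constraint cos(x) < 0 restricts x to (π/2, 3π/2), where sin(x) decreases toward −1, so the constrained minimum sits at the boundary x = 3π/2 ≈ 4.712. A brute-force grid search (added as a check, not part of the original tutorial) confirms this:

```python
import numpy as np

x = np.linspace(0, 2 * np.pi, 200001)
feasible = np.cos(x) < 0              # constraint c < 0
f = np.sin(x)
x_best = x[feasible][np.argmin(f[feasible])]
print(x_best, np.sin(x_best))         # ~4.712, ~-1.0
```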