%feature("docstring") OT::EfficientGlobalOptimization
"Efficient Global Optimization algorithm.

The EGO algorithm [jones1998]_ is an adaptive optimization method based on
kriging.
An initial design of experiments is used to build a first metamodel.
At each iteration a new point maximizing a criterion is chosen as the
candidate optimizer.
The criterion trades off the metamodel value against the conditional
variance.
The new point is then evaluated with the original model and the metamodel is
re-trained on the extended design of experiments.

Available constructors:
    EfficientGlobalOptimization(*problem, krigingResult*)
Parameters
----------
problem : :class:`~openturns.OptimizationProblem`
    The optimization problem to solve.
    Optionally, a second objective marginal can be used as the noise variance.
krigingResult : :class:`~openturns.KrigingResult`
    The kriging result obtained on the initial design of experiments.
Notes
-----
Each point added to the metamodel design seeks to improve the current minimum.
We choose the point so as to maximize an improvement criterion based on the
metamodel.

.. math::

    I(x_{new}) = \max(f_{min} - Y_{new}, 0)

with :math:`f_{min}` the current best known objective value and
:math:`Y_{new}` the metamodel prediction at :math:`x_{new}`.

The default criterion, called EI (Expected Improvement), aims at maximizing
the mean improvement:

.. math::

    \mathbb{E}\left[I(x_{new})\right] = \mathbb{E}\left[\max(f_{min} - Y_{new}, 0)\right]

This criterion has a closed-form expression in terms of the kriging mean and
variance:

.. math::

    \mathbb{E}\left[I(x_{new})\right] = (f_{min} - m_K(x_{new})) \Phi\left( \frac{f_{min} - m_K(x_{new})}{s_K(x_{new})} \right) + s_K(x_{new}) \phi\left( \frac{f_{min} - m_K(x_{new})}{s_K(x_{new})} \right)
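As an illustration only (a plain-Python sketch independent of the library,
with hypothetical values for the kriging mean and standard deviation), the
closed form above can be evaluated as follows:

```python
import math

def normal_pdf(u):
    # standard normal density
    return math.exp(-0.5 * u * u) / math.sqrt(2.0 * math.pi)

def normal_cdf(u):
    # standard normal cumulative distribution function
    return 0.5 * (1.0 + math.erf(u / math.sqrt(2.0)))

def expected_improvement(f_min, m_k, s_k):
    # closed-form EI from the kriging mean m_k and standard deviation s_k
    if s_k <= 0.0:
        # no predictive uncertainty: the improvement is deterministic
        return max(f_min - m_k, 0.0)
    u = (f_min - m_k) / s_k
    return (f_min - m_k) * normal_cdf(u) + s_k * normal_pdf(u)
```

At a point predicted exactly at the current minimum with unit uncertainty,
the EI reduces to the standard normal density at zero.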

An observation noise variance can be provided through a second objective marginal.

.. math:: Y_{obs} = Y(x) + \sigma_{\epsilon}(x) \epsilon

In that case the AEI (Augmented Expected Improvement) formulation is used.
As the real minimum of the function is no longer accessible, a quantile of the
kriging prediction is used instead, with a constant :math:`c`:

.. math:: u(x) = m_K(x) + c s_K(x)

This quantile is minimized over the design points:

.. math:: x_{min} = \argmin_{x_i} u(x_i)

The AEI criterion reads:

.. math::

    AEI(x_{new}) = \mathbb{E}\left[\max(m_K(x_{min}) - Y_{new}, 0)\right] \times \left(1 - \frac{\sigma_{\epsilon}(x_{new})}{\sqrt{\sigma_{\epsilon}^2(x_{new})+s^2_K(x_{new})}} \right)

with

.. math::

    \mathbb{E}\left[\max(m_K(x_{min}) - Y_{new}, 0)\right] = (m_K(x_{min}) - m_K(x_{new})) \Phi\left( \frac{m_K(x_{min}) - m_K(x_{new})}{s_K(x_{new})} \right) + s_K(x_{new}) \phi\left( \frac{m_K(x_{min}) - m_K(x_{new})}{s_K(x_{new})} \right)
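Under the same caveat (a plain-Python sketch with hypothetical inputs, not
the library's implementation), the AEI penalization of the expected
improvement by the noise level can be written as:

```python
import math

def normal_pdf(u):
    return math.exp(-0.5 * u * u) / math.sqrt(2.0 * math.pi)

def normal_cdf(u):
    return 0.5 * (1.0 + math.erf(u / math.sqrt(2.0)))

def augmented_expected_improvement(m_min, m_k, s_k, noise_sd):
    # m_min: kriging mean at the effective best design point x_min
    # m_k, s_k: kriging mean and standard deviation at the candidate x_new
    # noise_sd: observation noise standard deviation sigma_eps(x_new)
    u = (m_min - m_k) / s_k
    ei = (m_min - m_k) * normal_cdf(u) + s_k * normal_pdf(u)
    # penalize candidates whose observation noise dominates the kriging uncertainty
    penalty = 1.0 - noise_sd / math.sqrt(noise_sd ** 2 + s_k ** 2)
    return ei * penalty
```

With a zero noise standard deviation the penalty factor is 1 and AEI
coincides with EI.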

A less computationally expensive noise function can be provided through
:func:`setNoiseModel()` to evaluate :math:`\sigma^2_{\epsilon}(x)`
for the improvement criterion optimization; the objective is then only used to
compute values and the associated noise at design points.

By default the criterion is optimized using :class:`~openturns.MultiStart`
with starting points uniformly sampled in the optimization problem bounds,
see :func:`setMultiStartExperimentSize` and :func:`setMultiStartNumber`.
This behavior can be overridden by using another solver with :func:`setOptimizationAlgorithm`.
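The default multi-start seeding can be sketched as follows (a plain-Python
illustration, not the library's implementation; the function name and
signature are hypothetical):

```python
import random

def select_starting_points(criterion, lower, upper, experiment_size, n_starts, seed=0):
    # draw a uniform Monte Carlo design of experiment_size points in the bounds,
    # then keep the n_starts points with the largest criterion values as seeds
    rng = random.Random(seed)
    design = [
        [rng.uniform(lo, up) for lo, up in zip(lower, upper)]
        for _ in range(experiment_size)
    ]
    # rank the candidates by the improvement criterion, best first
    design.sort(key=criterion, reverse=True)
    return design[:n_starts]

# toy criterion peaked at x = 0.5 on [0, 1]
starts = select_starting_points(lambda x: -abs(x[0] - 0.5), [0.0], [1.0], 100, 5)
```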

Examples
--------
>>> import openturns as ot
>>> ot.RandomGenerator.SetSeed(0)
>>> dim = 4
>>> model = ot.SymbolicFunction(['x1', 'x2', 'x3', 'x4'],
...     ['x1*x1+x2^3*x1+x3+x4'])
>>> model = ot.MemoizeFunction(model)
>>> bounds = ot.Interval([-5.0] * dim, [5.0] * dim)
>>> problem = ot.OptimizationProblem()
>>> problem.setObjective(model)
>>> problem.setBounds(bounds)
>>> experiment = ot.Composite([0.0] * dim, [1.0, 2.0, 4.0])
>>> inputSample = experiment.generate()
>>> outputSample = model(inputSample)
>>> print('Initial minimum output value: ', outputSample.getMin())
Initial minimum output value:  [-248]
>>> covarianceModel = ot.SquaredExponential([2.0] * dim, [0.1])
>>> basis = ot.ConstantBasisFactory(dim).build()
>>> kriging = ot.KrigingAlgorithm(inputSample, outputSample, covarianceModel, basis)
>>> kriging.run()
>>> algo = ot.EfficientGlobalOptimization(problem, kriging.getResult())
>>> algo.setMaximumEvaluationNumber(2)
>>> algo.run()
>>> result = algo.getResult()
>>> updatedKrigingResult = algo.getKrigingResult()
>>> updatedOutputSample = updatedKrigingResult.getOutputSample()
>>> print('Updated minimum output value: ', updatedOutputSample.getMin())
Updated minimum output value:  [-610]"

// ---------------------------------------------------------------------

%feature("docstring") OT::EfficientGlobalOptimization::setOptimizationAlgorithm
"Expected improvement solver accessor.

Parameters
----------
solver : :class:`~openturns.OptimizationSolver`
    The solver used to optimize the expected improvement."

// ---------------------------------------------------------------------

%feature("docstring") OT::EfficientGlobalOptimization::getOptimizationAlgorithm
"Expected improvement solver accessor.

Returns
-------
solver : :class:`~openturns.OptimizationSolver`
    The solver used to optimize the expected improvement."

// ---------------------------------------------------------------------

%feature("docstring") OT::EfficientGlobalOptimization::setMultiStartExperimentSize
"Size of the design to draw starting points.

Parameters
----------
multiStartExperimentSize : int
    The size of the Monte Carlo design from which to select the best starting
    points.
    The default number can be tweaked with the
    `EfficientGlobalOptimization-DefaultMultiStartExperimentSize` key from
    :class:`~openturns.ResourceMap`."

// ---------------------------------------------------------------------

%feature("docstring") OT::EfficientGlobalOptimization::getMultiStartExperimentSize
"Size of the design to draw starting points.

Returns
-------
multiStartExperimentSize : int
    The size of the Monte Carlo design from which to select the best starting
    points."

// ---------------------------------------------------------------------

%feature("docstring") OT::EfficientGlobalOptimization::setMultiStartNumber
"Number of starting points for the criterion optimization.

Parameters
----------
multiStartNumber : int
    The number of starting points for the criterion optimization.
    The default number can be tweaked with the
    `EfficientGlobalOptimization-DefaultMultiStartNumber` key from
    :class:`~openturns.ResourceMap`."

// ---------------------------------------------------------------------

%feature("docstring") OT::EfficientGlobalOptimization::getMultiStartNumber
"Number of starting points for the criterion optimization.

Returns
-------
multiStartNumber : int
    The number of starting points for the criterion optimization."

// ---------------------------------------------------------------------

%feature("docstring") OT::EfficientGlobalOptimization::setParameterEstimationPeriod
"Parameter estimation period accessor.

Parameters
----------
period : int
    The number of iterations between re-estimations of the covariance
    parameters.
    Default is 1 (each iteration). Can be set to 0 (never).
    The default number can be tweaked with the
    `EfficientGlobalOptimization-DefaultParameterEstimationPeriod` key from
    :class:`~openturns.ResourceMap`."

// ---------------------------------------------------------------------

%feature("docstring") OT::EfficientGlobalOptimization::getParameterEstimationPeriod
"Parameter estimation period accessor.

Returns
-------
period : int
    The number of iterations between re-estimations of the covariance
    parameters.
    Default is 1 (each iteration). Can be set to 0 (never)."

// ---------------------------------------------------------------------

%feature("docstring") OT::EfficientGlobalOptimization::setImprovementFactor
"Improvement criterion factor accessor.

Parameters
----------
alpha : positive float, default=0.0 (disables the criterion)
    Used to define a stopping criterion on the improvement criterion:
    :math:`I_{max} < \alpha |Y_{min}|`
    with :math:`I_{max}` the current maximum of the improvement
    and :math:`Y_{min}` the current optimum."

// ---------------------------------------------------------------------

%feature("docstring") OT::EfficientGlobalOptimization::getImprovementFactor
"Improvement criterion factor accessor.

Returns
-------
alpha : positive float, default=0.0 (disables the criterion)
    Used to define a stopping criterion on the improvement criterion:
    :math:`I_{max} < \alpha |Y_{min}|`
    with :math:`I_{max}` the current maximum of the improvement
    and :math:`Y_{min}` the current optimum."

// ---------------------------------------------------------------------

%feature("docstring") OT::EfficientGlobalOptimization::setCorrelationLengthFactor
"Correlation length stopping criterion factor accessor.

When a correlation length becomes smaller than the minimal distance between
design points for a single component, the model tends to be noisy and the
EGO formulation is no longer suited.

Parameters
----------
b : float
    Used to define a stopping criterion on the minimum correlation length:
    :math:`\theta_i < \frac{\Delta_i^{min}}{b}`
    with :math:`\Delta_i^{min}` the minimum distance between design points
    along the i-th component."

// ---------------------------------------------------------------------

%feature("docstring") OT::EfficientGlobalOptimization::getCorrelationLengthFactor
"Correlation length stopping criterion factor accessor.

When a correlation length becomes smaller than the minimal distance between
design points for a single component, the model tends to be noisy and the
EGO formulation is no longer suited.

Returns
-------
b : float
    Used to define a stopping criterion on the minimum correlation length:
    :math:`\theta_i < \frac{\Delta_i^{min}}{b}`
    with :math:`\Delta_i^{min}` the minimum distance between design points
    along the i-th component."

// ---------------------------------------------------------------------

%feature("docstring") OT::EfficientGlobalOptimization::setAEITradeoff
"AEI tradeoff constant accessor.

Parameters
----------
c : float
    Used to define a quantile of the kriging prediction at the design points:
    :math:`u(x) = m_K(x) + c s_K(x)`"

// ---------------------------------------------------------------------

%feature("docstring") OT::EfficientGlobalOptimization::getAEITradeoff
"AEI tradeoff constant accessor.

Returns
-------
c : float
    Used to define a quantile of the kriging prediction at the design points:
    :math:`u(x) = m_K(x) + c s_K(x)`"

// ---------------------------------------------------------------------

%feature("docstring") OT::EfficientGlobalOptimization::getExpectedImprovement
"Expected improvement values.

Returns
-------
ei : :class:`~openturns.Sample`
    The expected improvement optimal values."

// ---------------------------------------------------------------------

%feature("docstring") OT::EfficientGlobalOptimization::setNoiseModel
"Improvement noise model accessor.

Parameters
----------
noiseVariance : :class:`~openturns.Function`
    The noise variance :math:`\sigma^2_{\epsilon}(x)` used for the AEI
    criterion optimization only.
    It must have the same input dimension as the objective and a 1-d output."

// ---------------------------------------------------------------------

%feature("docstring") OT::EfficientGlobalOptimization::getNoiseModel
"Improvement noise model accessor.

Returns
-------
noiseVariance : :class:`~openturns.Function`
    The noise variance :math:`\sigma^2_{\epsilon}(x)` used for the AEI
    criterion optimization only.
    It must have the same input dimension as the objective and a 1-d output."

// ---------------------------------------------------------------------

%feature("docstring") OT::EfficientGlobalOptimization::getKrigingResult
"Retrieve the Kriging result.

Notes
-----
Before :meth:`run` is called, this method returns the
:class:`~openturns.KrigingResult` passed to the constructor.
Once :meth:`run` has been called, it returns an updated
:class:`~openturns.KrigingResult` that takes new observations into account.

Returns
-------
krigingResult : :class:`~openturns.KrigingResult`
    Kriging result that takes all observations into account."
337