%feature("docstring") OT::KrigingAlgorithm "Kriging algorithm. Refer to :ref:`kriging`. Available constructors: KrigingAlgorithm(*inputSample, outputSample, covarianceModel, basis*) KrigingAlgorithm(*inputSample, outputSample, covarianceModel, basisCollection*) Parameters ---------- inputSample, outputSample : 2-d sequence of float The samples :math:`(\vect{x}_k)_{1 \leq k \leq N} \in \Rset^d` and :math:`(\vect{y}_k)_{1 \leq k \leq N}\in \Rset^p` upon which the meta-model is built. covarianceModel : :class:`~openturns.CovarianceModel` Covariance model used for the underlying Gaussian process assumption. basis : :class:`~openturns.Basis` Functional basis to estimate the trend (universal kriging): :math:`(\varphi_j)_{1 \leq j \leq n_1}: \Rset^d \rightarrow \Rset`. If :math:`p>1`, the same basis is used for each marginal output. basisCollection : sequence of :class:`~openturns.Basis` Collection of :math:`p` functional basis: one basis for each marginal output: :math:`\left[(\varphi_j^1)_{1 \leq j \leq n_1}, \dots, (\varphi_j^p)_{1 \leq j \leq n_p}\right]`. If the sequence is empty, no trend coefficient is estimated (simple kriging). Notes ----- We suppose we have a sample :math:`(\vect{x}_k, \vect{y}_k)_{1 \leq k \leq N}` where :math:`\vect{y}_k = \cM(\vect{x}_k)` for all *k*, with :math:`\cM:\Rset^d \mapsto \Rset^p` the model. The meta model *Kriging* is based on the same principles as those of the general linear model: it assumes that the sample :math:`(\vect{y}_k)_{1 \leq k \leq N}` is considered as the trace of a Gaussian process :math:`\vect{Y}(\omega, \vect{x})` on :math:`(\vect{x}_k)_{1 \leq k \leq N}`. The Gaussian process :math:`\vect{Y}(\omega, \vect{x})` is defined by: .. math:: :label: metaModelKrigAlgo \vect{Y}(\omega, \vect{x}) = \vect{\mu}(\vect{x}) + W(\omega, \vect{x}) where: .. math:: \vect{\mu}(\vect{x}) = \left( \begin{array}{l} \mu_1(\vect{x}) \\ \dots \\ \mu_p(\vect{x}) \end{array} \right) with :math:`\mu_l(\vect{x}) = \sum_{j=1}^{n_l} \beta_j^l \varphi_j^l(\vect{x})` and :math:`\varphi_j^l: \Rset^d \rightarrow \Rset` the trend functions. :math:`W` is a Gaussian process of dimension *p* with zero mean and covariance function :math:`C = C(\vect{\theta}, \vect{\sigma}, \mat{R}, \vect{\lambda})` (see :class:`~openturns.CovarianceModel` for the notations). The estimation of the all parameters (the trend coefficients :math:`\beta_j^l`, the scale :math:`\vect{\theta}` and the amplitude :math:`\vect{\sigma}`) are made by the :class:`~openturns.GeneralLinearModelAlgorithm` class. The Kriging algorithm makes the general linear model interpolating on the input samples. The Kriging meta model :math:`\tilde{\cM}` is defined by: .. math:: \tilde{\cM}(\vect{x}) = \vect{\mu}(\vect{x}) + \Expect{\vect{Y}(\omega, \vect{x})\, | \, \cC} where :math:`\cC` is the condition :math:`\vect{Y}(\omega, \vect{x}_k) = \vect{y}_k` for each :math:`k \in [1, N]`. :eq:`metaModelKrigAlgo` writes: .. math:: \tilde{\cM}(\vect{x}) = \vect{\mu}(\vect{x}) + \Cov{\vect{Y}(\omega, \vect{x}), (\vect{Y}(\omega, \vect{x}_1), \dots, \vect{Y}(\omega, \vect{x}_N))} \vect{\gamma} where :math:`\Cov{\vect{Y}(\omega, \vect{x}), (\vect{Y}(\omega, \vect{x}_1), \dots, \vect{Y}(\omega, \vect{x}_N))} = \left( \mat{C}( \vect{x}, \vect{x}_1) | \dots | \mat{C}( \vect{x}, \vect{x}_N) \right)` is a matrix in :math:`\cM_{p,NP}(\Rset)` and :math:`\vect{\gamma} = \mat{C}^{-1}(\vect{y}-\vect{m})`. A known centered gaussian observation noise :math:`\epsilon_k` can be taken into account with :func:`setNoise()`: .. math:: \hat{\vect{y}}_k = \vect{y}_k + \epsilon_k, \epsilon_k \sim \mathcal{N}(0, \tau_k^2) Examples -------- Create the model :math:`\cM: \Rset \mapsto \Rset` and the samples: >>> import openturns as ot >>> f = ot.SymbolicFunction(['x'], ['x * sin(x)']) >>> sampleX = [[1.0], [2.0], [3.0], [4.0], [5.0], [6.0], [7.0], [8.0]] >>> sampleY = f(sampleX) Create the algorithm: >>> basis = ot.Basis([ot.SymbolicFunction(['x'], ['x']), ot.SymbolicFunction(['x'], ['x^2'])]) >>> covarianceModel = ot.SquaredExponential([1.0]) >>> covarianceModel.setActiveParameter([]) >>> algo = ot.KrigingAlgorithm(sampleX, sampleY, covarianceModel, basis) >>> algo.run() Get the resulting meta model: >>> result = algo.getResult() >>> metamodel = result.getMetaModel()" // --------------------------------------------------------------------- %feature("docstring") OT::KrigingAlgorithm::getOptimizationAlgorithm "Accessor to solver used to optimize the covariance model parameters. Returns ------- algorithm : :class:`~openturns.OptimizationAlgorithm` Solver used to optimize the covariance model parameters." // --------------------------------------------------------------------- %feature("docstring") OT::KrigingAlgorithm::setOptimizationAlgorithm "Accessor to the solver used to optimize the covariance model parameters. Parameters ---------- algorithm : :class:`~openturns.OptimizationAlgorithm` Solver used to optimize the covariance model parameters. Examples -------- Create the model :math:`\cM: \Rset \mapsto \Rset` and the samples: >>> import openturns as ot >>> input_data = ot.Uniform(-1.0, 2.0).getSample(10) >>> model = ot.SymbolicFunction(['x'], ['x-1+sin(pi_*x/(1+0.25*x^2))']) >>> output_data = model(input_data) Create the Kriging algorithm with the optimizer option: >>> basis = ot.Basis([ot.SymbolicFunction(['x'], ['0.0'])]) >>> thetaInit = 1.0 >>> covariance = ot.GeneralizedExponential([thetaInit], 2.0) >>> bounds = ot.Interval(1e-2,1e2) >>> algo = ot.KrigingAlgorithm(input_data, output_data, covariance, basis) >>> algo.setOptimizationBounds(bounds) " // --------------------------------------------------------------------- %feature("docstring") OT::KrigingAlgorithm::setOptimizationBounds "Accessor to the optimization bounds. Parameters ---------- bounds : :class:`~openturns.Interval` The bounds used for numerical optimization of the likelihood. Notes ----- See :class:`~openturns.GeneralLinearModelAlgorithm` class for more details, particularly :meth:`~openturns.GeneralLinearModelAlgorithm.setOptimizationBounds`." // --------------------------------------------------------------------- %feature("docstring") OT::KrigingAlgorithm::getOptimizationBounds "Accessor to the optimization bounds. Returns ------- problem : :class:`~openturns.Interval` The bounds used for numerical optimization of the likelihood." // --------------------------------------------------------------------- %feature("docstring") OT::KrigingAlgorithm::getResult "Get the results of the metamodel computation. Returns ------- result : :class:`~openturns.KrigingResult` Structure containing all the results obtained after computation and created by the method :py:meth:`run`. " //----------------------------------------------------------------------- %feature("docstring") OT::KrigingAlgorithm::getInputSample "Accessor to the input sample. Returns ------- inputSample : :class:`~openturns.Sample` The input sample :math:`(\vect{x}_k)_{1 \leq k \leq N}`. " // --------------------------------------------------------------------- %feature("docstring") OT::KrigingAlgorithm::getOutputSample "Accessor to the output sample. Returns ------- outputSample : :class:`~openturns.Sample` The output sample :math:`(\vect{y}_k)_{1 \leq k \leq N}` . " // --------------------------------------------------------------------- %feature("docstring") OT::KrigingAlgorithm::getReducedLogLikelihoodFunction "Accessor to the reduced log-likelihood function that writes as argument of the covariance's model parameters. Returns ------- reducedLogLikelihood : :class:`~openturns.Function` The *potentially* reduced log-likelihood function. Notes ----- We use the same notations as in :class:`~openturns.CovarianceModel` and :class:`~openturns.GeneralLinearModelAlgorithm` : :math:`\vect{\theta}` refers to the scale parameters and :math:`\vect{\sigma}` the amplitude. We can consider three situtations here: - Output dimension is :math:`\geq 2`. In that case, we get the **full** log-likelihood function :math:`\mathcal{L}(\vect{\theta}, \vect{\sigma})`. - Output dimension is **1** and the `GeneralLinearModelAlgorithm-UseAnalyticalAmplitudeEstimate` key of :class:`~openturns.ResourceMap` is set to *True*. The amplitude parameter of the covariance model :math:`\vect{\theta}` is in the active set of parameters and thus we get the **reduced** log-likelihood function :math:`\mathcal{L}(\vect{\theta})`. - Output dimension is **1** and the `GeneralLinearModelAlgorithm-UseAnalyticalAmplitudeEstimate` key of :class:`~openturns.ResourceMap` is set to *False*. In that case, we get the **full** log-likelihood :math:`\mathcal{L}(\vect{\theta}, \vect{\sigma})`. The reduced log-likelihood function may be useful for some pre/postprocessing: vizualisation of the maximizer, use of an external optimizers to maximize the reduced log-likelihood etc. Examples -------- Create the model :math:`\cM: \Rset \mapsto \Rset` and the samples: >>> import openturns as ot >>> f = ot.SymbolicFunction(['x0'], ['x0 * sin(x0)']) >>> inputSample = ot.Sample([[1.0], [3.0], [5.0], [6.0], [7.0], [8.0]]) >>> outputSample = f(inputSample) Create the algorithm: >>> basis = ot.ConstantBasisFactory().build() >>> covarianceModel = ot.SquaredExponential(1) >>> algo = ot.KrigingAlgorithm(inputSample, outputSample, covarianceModel, basis) >>> algo.run() Get the reduced log-likelihood function: >>> reducedLogLikelihoodFunction = algo.getReducedLogLikelihoodFunction() " // --------------------------------------------------------------------- %feature("docstring") OT::KrigingAlgorithm::run "Compute the response surface. Notes ----- It computes the kriging response surface and creates a :class:`~openturns.KrigingResult` structure containing all the results." // --------------------------------------------------------------------- %feature("docstring") OT::KrigingAlgorithm::setOptimizeParameters "Accessor to the covariance model parameters optimization flag. Parameters ---------- optimizeParameters : bool Whether to optimize the covariance model parameters." // --------------------------------------------------------------------- %feature("docstring") OT::KrigingAlgorithm::getOptimizeParameters "Accessor to the covariance model parameters optimization flag. Returns ------- optimizeParameters : bool Whether to optimize the covariance model parameters." // --------------------------------------------------------------------- %feature("docstring") OT::KrigingAlgorithm::setNoise "Observation noise variance accessor. Parameters ---------- noise : sequence of positive float The noise variance :math:`\tau_k^2` of each output value." // --------------------------------------------------------------------- %feature("docstring") OT::KrigingAlgorithm::getNoise "Observation noise variance accessor. Returns ------- noise : sequence of positive float The noise variance :math:`\tau_k^2` of each output value." // --------------------------------------------------------------------- %feature("docstring") OT::KrigingAlgorithm::getMethod "Linear algebra method accessor. Returns ------- method : str Used linear algebra method." // --------------------------------------------------------------------- %feature("docstring") OT::KrigingAlgorithm::setMethod "Linear algebra method set accessor. Parameters ---------- method : str Used linear algebra method. Value should be `LAPACK` or `HMAT` Notes ----- The setter update the implementation and require new evaluation. We might also use the ResourceMap key to set the method when instantiating the algorithm. For that purpose, we can use ResourceMap.SetAsString(`GeneralLinearModelAlgorithm-LinearAlgebra`, key) with `key` being `HMAT` or `LAPACK`."