% Generated by roxygen2: do not edit by hand
% Please edit documentation in R/global.R
\name{crs2lm}
\alias{crs2lm}
\title{Controlled Random Search}
\usage{
crs2lm(x0, fn, lower, upper, maxeval = 10000, pop.size = 10 *
  (length(x0) + 1), ranseed = NULL, xtol_rel = 1e-06,
  nl.info = FALSE, ...)
}
\arguments{
\item{x0}{initial point for searching the optimum.}

\item{fn}{objective function that is to be minimized.}

\item{lower, upper}{lower and upper bound constraints.}

\item{maxeval}{maximum number of function evaluations.}

\item{pop.size}{population size.}

\item{ranseed}{prescribed seed for the random number generator.}

\item{xtol_rel}{stopping criterion for the relative change in the optimum.}

\item{nl.info}{logical; shall the original NLopt info be shown.}

\item{...}{additional arguments passed to the function.}
}
\value{
List with components:
  \item{par}{the optimal solution found so far.}
  \item{value}{the function value corresponding to \code{par}.}
  \item{iter}{number of (outer) iterations, see \code{maxeval}.}
  \item{convergence}{integer code indicating successful completion (> 0)
    or a possible error number (< 0).}
  \item{message}{character string produced by NLopt and giving additional
    information.}
}
\description{
The Controlled Random Search (CRS) algorithm (and in particular, the CRS2
variant) with the `local mutation' modification.
}
\details{
The CRS algorithms are sometimes compared to genetic algorithms, in that
they start with a random population of points and randomly evolve these
points by heuristic rules. In this case, the evolution somewhat resembles a
randomized Nelder-Mead algorithm.

The published results for CRS seem to be largely empirical.
}
\note{
The initial population size for CRS defaults to \code{10 * (n + 1)} in
\code{n} dimensions, but this can be changed; the initial population must be
at least \code{n + 1}.
}
\examples{

### Minimize the Hartmann6 function
hartmann6 <- function(x) {
    n <- length(x)
    a <- c(1.0, 1.2, 3.0, 3.2)
    A <- matrix(c(10.0,  0.05, 3.0, 17.0,
                   3.0, 10.0,  3.5,  8.0,
                  17.0, 17.0,  1.7,  0.05,
                   3.5,  0.1, 10.0, 10.0,
                   1.7,  8.0, 17.0,  0.1,
                   8.0, 14.0,  8.0, 14.0), nrow=4, ncol=6)
    B  <- matrix(c(.1312,.2329,.2348,.4047,
                   .1696,.4135,.1451,.8828,
                   .5569,.8307,.3522,.8732,
                   .0124,.3736,.2883,.5743,
                   .8283,.1004,.3047,.1091,
                   .5886,.9991,.6650,.0381), nrow=4, ncol=6)
    fun <- 0.0
    for (i in 1:4) {
        fun <- fun - a[i] * exp(-sum(A[i,]*(x-B[i,])^2))
    }
    return(fun)
}
S <- crs2lm(x0 = rep(0, 6), hartmann6, lower = rep(0, 6), upper = rep(1, 6),
            maxeval = 10000, nl.info = TRUE)
## Number of Iterations....: 4050
## Termination conditions:  maxeval: 10000  xtol_rel: 1e-06
## Number of inequality constraints:  0
## Number of equality constraints:    0
## Optimal value of objective function:  -3.32236801141328
## Optimal value of controls:
##     0.2016893 0.1500105 0.4768738 0.2753326 0.3116516 0.6573004
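
## CRS2 is a stochastic algorithm, so repeated runs give (slightly)
## different results; fixing `ranseed' makes a run reproducible. A sketch
## (not run; the seed value 1234L is purely illustrative), reusing the
## hartmann6 objective defined above:
## S2 <- crs2lm(x0 = rep(0, 6), hartmann6, lower = rep(0, 6),
##              upper = rep(1, 6), ranseed = 1234L)
## A second call with the same ranseed should reproduce S2$par and S2$value.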

}
\references{
W. L. Price, ``Global optimization by controlled random
search,'' J. Optim. Theory Appl. 40 (3), 333-348 (1983).

P. Kaelo and M. M. Ali, ``Some variants of the controlled random search
algorithm for global optimization,'' J. Optim. Theory Appl. 130 (2), 253-264
(2006).
}