SciPy

scipy.optimize.dual_annealing

scipy.optimize.dual_annealing(func, bounds, args=(), maxiter=1000, local_search_options={}, initial_temp=5230.0, restart_temp_ratio=2e-05, visit=2.62, accept=-5.0, maxfun=10000000.0, seed=None, no_local_search=False, callback=None, x0=None)[source]

Find the global minimum of a function using Dual Annealing.

Parameters:
func : callable

The objective function to be minimized. Must be in the form f(x, *args), where x is the argument in the form of a 1-D array and args is a tuple of any additional fixed parameters needed to completely specify the function.

bounds : sequence, shape (n, 2)

Bounds for variables. (min, max) pairs for each element in x, defining the bounds on the objective function's parameters.

args : tuple, optional

Any additional fixed parameters needed to completely specify the objective function.

maxiter : int, optional

The maximum number of global search iterations. Default value is 1000.

local_search_options : dict, optional

Extra keyword arguments to be passed to the local minimizer (minimize). Useful options include method, for the minimizer method to use, and args, for additional objective function arguments.
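
As an illustrative sketch, the local search method can be switched through this dict; method is a standard minimize keyword argument, and the quadratic objective below is only for demonstration:

>>> from scipy.optimize import dual_annealing
>>> ret = dual_annealing(lambda x: (x**2).sum(), bounds=[(-10., 10.)] * 2,
...                      local_search_options={"method": "L-BFGS-B"})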

initial_temp : float, optional

The initial temperature. Use higher values to facilitate a wider search of the energy landscape, allowing dual_annealing to escape local minima in which it is trapped. Default value is 5230. Range is (0.01, 5.e4].

restart_temp_ratio : float, optional

During the annealing process, the temperature decreases; when it reaches initial_temp * restart_temp_ratio, the reannealing process is triggered. Default value of the ratio is 2e-5. Range is (0, 1).

visit : float, optional

Parameter for the visiting distribution. Default value is 2.62. Higher values give the visiting distribution a heavier tail, which makes the algorithm jump to more distant regions. The value range is (0, 3].

accept : float, optional

Parameter for acceptance distribution. It is used to control the probability of acceptance. The lower the acceptance parameter, the smaller the probability of acceptance. Default value is -5.0 with a range (-1e4, -5].

maxfun : int, optional

Soft limit for the number of objective function calls. If the algorithm is in the middle of a local search when this number is exceeded, it will stop only after the local search is done. Default value is 1e7.

seed : {int or numpy.random.RandomState instance}, optional

If seed is not specified, the numpy.random.RandomState singleton is used. If seed is an int, a new RandomState instance is used, seeded with seed. If seed is already a RandomState instance, then that instance is used. Specify seed for repeatable minimizations. The random numbers generated with this seed only affect the visiting distribution function and new coordinates generation.
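
A minimal sketch of such repeatability (the quadratic objective is only for illustration):

>>> from scipy.optimize import dual_annealing
>>> f = lambda x: (x**2).sum()
>>> r1 = dual_annealing(f, bounds=[(-10., 10.)] * 2, seed=42)
>>> r2 = dual_annealing(f, bounds=[(-10., 10.)] * 2, seed=42)
>>> bool((r1.x == r2.x).all())  # identical seeds give identical runs
True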

no_local_search : bool, optional

If no_local_search is set to True, a traditional Generalized Simulated Annealing will be performed with no local search strategy applied.

callback : callable, optional

A callback function with signature callback(x, f, context), which will be called for all minima found. x and f are the coordinates and function value of the latest minimum found, and context has value in [0, 1, 2], with the following meaning:

  • 0: minimum detected in the annealing process.
  • 1: detection occurred in the local search process.
  • 2: detection done in the dual annealing process.

If the callback implementation returns True, the algorithm will stop.
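
As a minimal sketch (the quadratic objective and the 1e-8 threshold are only for illustration), a callback can end the run as soon as a sufficiently good minimum is reported:

>>> from scipy.optimize import dual_annealing
>>> def stop_early(x, f, context):
...     # context: 0 = annealing, 1 = local search, 2 = dual annealing
...     return f < 1e-8  # returning True stops the algorithm
>>> ret = dual_annealing(lambda x: (x**2).sum(), bounds=[(-10., 10.)] * 2,
...                      callback=stop_early, seed=1234)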

x0 : ndarray, shape(n,), optional

Coordinates of a single n-dimensional starting point.

Returns:
res : OptimizeResult

The optimization result represented as an OptimizeResult object. Important attributes are: x, the solution array; fun, the value of the function at the solution; and message, which describes the cause of the termination. See OptimizeResult for a description of other attributes.

Notes

This function implements the Dual Annealing optimization. This stochastic approach, derived from [3], combines the generalization of CSA (Classical Simulated Annealing) and FSA (Fast Simulated Annealing) [1] [2], coupled with a strategy for applying a local search on accepted locations [4]. An alternative implementation of this same algorithm is described in [5] and benchmarks are presented in [6]. This approach introduces an advanced method to refine the solution found by the generalized annealing process. The algorithm uses a distorted Cauchy-Lorentz visiting distribution, with its shape controlled by the parameter \(q_{v}\)

\[g_{q_{v}}(\Delta x(t)) \propto \frac{\left[T_{q_{v}}(t)\right]^{-\frac{D}{3-q_{v}}}}{\left[1+(q_{v}-1)\frac{(\Delta x(t))^{2}}{\left[T_{q_{v}}(t)\right]^{\frac{2}{3-q_{v}}}}\right]^{\frac{1}{q_{v}-1}+\frac{D-1}{2}}}\]

Where \(t\) is the artificial time. This visiting distribution is used to generate a trial jump distance \(\Delta x(t)\) of variable \(x(t)\) under artificial temperature \(T_{q_{v}}(t)\).

From the starting point, after calling the visiting distribution function, the acceptance probability is computed as follows:

\[p_{q_{a}} = \min{\{1,\left[1-(1-q_{a}) \beta \Delta E \right]^{\frac{1}{1-q_{a}}}\}}\]

Where \(q_{a}\) is an acceptance parameter. For \(q_{a}<1\), zero acceptance probability is assigned to the cases where

\[[1-(1-q_{a}) \beta \Delta E] < 0\]
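
Purely as an illustration of the formula above (acceptance_probability and the beta argument, standing in for \(\beta\), are hypothetical names, not part of the SciPy API):

>>> def acceptance_probability(delta_e, beta, qa=-5.0):
...     # p = min{1, [1 - (1 - qa)*beta*dE] ** (1/(1 - qa))}
...     base = 1.0 - (1.0 - qa) * beta * delta_e
...     if base < 0:  # zero acceptance probability for qa < 1
...         return 0.0
...     return min(1.0, base ** (1.0 / (1.0 - qa)))
>>> acceptance_probability(-1.0, beta=1.0)  # downhill moves are always taken
1.0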

The artificial temperature \(T_{q_{v}}(t)\) is decreased according to

\[T_{q_{v}}(t) = T_{q_{v}}(1) \frac{2^{q_{v}-1}-1}{\left(1 + t\right)^{q_{v}-1}-1}\]

Where \(q_{v}\) is the visiting parameter.
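
A short sketch of this schedule (temperature is a hypothetical helper, using the default initial_temp and visit values documented above):

>>> def temperature(t, t1=5230.0, qv=2.62):
...     # T(t) = T(1) * (2**(qv-1) - 1) / ((1 + t)**(qv-1) - 1)
...     return t1 * (2.0 ** (qv - 1.0) - 1.0) / ((1.0 + t) ** (qv - 1.0) - 1.0)
>>> round(temperature(1), 1)  # the schedule starts at initial_temp
5230.0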

New in version 1.2.0.

References

[1] Tsallis C. Possible generalization of Boltzmann-Gibbs statistics. Journal of Statistical Physics, 52, 479-487 (1988).
[2] Tsallis C, Stariolo DA. Generalized Simulated Annealing. Physica A, 233, 395-406 (1996).
[3] Xiang Y, Sun DY, Fan W, Gong XG. Generalized Simulated Annealing Algorithm and Its Application to the Thomson Model. Physics Letters A, 233, 216-220 (1997).
[4] Xiang Y, Gong XG. Efficiency of Generalized Simulated Annealing. Physical Review E, 62, 4473 (2000).
[5] Xiang Y, Gubian S, Suomela B, Hoeng J. Generalized Simulated Annealing for Efficient Global Optimization: the GenSA Package for R. The R Journal, Volume 5/1 (2013).
[6] Mullen, K. Continuous Global Optimization in R. Journal of Statistical Software, 60(6), 1-45 (2014). DOI:10.18637/jss.v060.i06

Examples

The following example is a 10-dimensional problem with many local minima. The function involved is the Rastrigin function (https://en.wikipedia.org/wiki/Rastrigin_function).

>>> import numpy as np
>>> from scipy.optimize import dual_annealing
>>> func = lambda x: np.sum(x*x - 10*np.cos(2*np.pi*x)) + 10*np.size(x)
>>> lw = [-5.12] * 10
>>> up = [5.12] * 10
>>> ret = dual_annealing(func, bounds=list(zip(lw, up)), seed=1234)
>>> print("global minimum: xmin = {0}, f(xmin) = {1:.6f}".format(
...       ret.x, ret.fun))
global minimum: xmin = [-4.26437714e-09 -3.91699361e-09 -1.86149218e-09 -3.97165720e-09
 -6.29151648e-09 -6.53145322e-09 -3.93616815e-09 -6.55623025e-09
 -6.05775280e-09 -5.00668935e-09], f(xmin) = 0.000000
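
For comparison, setting no_local_search runs the traditional Generalized Simulated Annealing on the same problem; the result is typically less precise than with the default local search refinement (a sketch, reusing func, lw and up from above):

>>> ret_gsa = dual_annealing(func, bounds=list(zip(lw, up)),
...                          no_local_search=True, seed=1234)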
