[Algorithm] Simulated Annealing Algorithm (SAA)

Introduction to the Simulated Annealing Algorithm (SAA)

The Simulated Annealing Algorithm (SAA) is inspired by the annealing process in metal casting: as the temperature rises, molecular motion becomes disordered; after slow cooling, it becomes orderly again. The algorithm exploits a probabilistic jump property (dependent on the current temperature) to search for, or approach, the optimal solution in the solution space probabilistically, which makes it a common way to escape local optima. It has important and widespread applications in Very Large Scale Integration (VLSI) design, neural network computers, computer vision, the Traveling Salesman Problem (TSP), the knapsack problem, and more.

The simulated annealing algorithm adopts a serial structure.

SAA steps

The three elements of the simulated annealing algorithm are the solution space, the objective function and the initial solution. Its basic idea is:

  1. Initialization
    First determine an initial temperature T, an initial solution state S, and the number of iterations L for each value of T.
  2. Iterate
    Repeat the following operations L times (the Metropolis process, labelled (2) below):
    Generate a new solution S′.
    Compute the increment Δt′ = C(S′) − C(S), where C(S) is the evaluation function.
    If Δt′ < 0, accept S′ as the new current solution; otherwise accept S′ as the new current solution with probability exp(−Δt′/(kT)), where k is Boltzmann's constant.
    If the termination condition is met, accept the current solution as the optimal solution and stop the iteration; otherwise continue.
    Gradually decrease T toward 0 (the annealing process, labelled (1) below), then return to step 2.

SAA Principle

The simulated annealing algorithm consists of two parts, the Metropolis algorithm and the annealing process, which correspond to the inner loop and the outer loop respectively. The outer loop heats the solid to a high initial temperature T0 and then lowers the temperature by a fixed proportion given by the cooling coefficient alpha. When the termination temperature Tf is reached, cooling stops and the annealing process ends; this corresponds to process (1) above.

The Metropolis algorithm is the inner loop: it iterates L times at each updated temperature and finds the minimum energy (the optimal solution) at that temperature, corresponding to process (2) above.
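The two nested loops described above can be sketched as a generic routine; `objective` and `neighbor` are caller-supplied functions, and the default parameter values are illustrative only:

```python
import math
import random

def anneal(objective, neighbor, x0, T0=1000.0, Tf=1e-3, alpha=0.95, L=100):
    """Generic SA skeleton: the outer loop cools, the inner Metropolis
    loop samples L candidate moves at each fixed temperature."""
    x, y = x0, objective(x0)
    T = T0
    while T > Tf:                  # outer loop: annealing process (1)
        for _ in range(L):         # inner loop: Metropolis sampling (2)
            x_new = neighbor(x)
            y_new = objective(x_new)
            delta = y_new - y
            # always accept improvements; accept worse moves with prob exp(-delta/T)
            if delta < 0 or random.random() < math.exp(-delta / T):
                x, y = x_new, y_new
        T *= alpha                 # geometric cooling
    return x, y
```

For example, `anneal(lambda v: v*v, lambda v: v + random.uniform(-1, 1), 10.0)` drives the search toward the minimum of x² at 0.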

Energy changes explain when the simulated annealing algorithm accepts a new solution:

Metropolis criterion:
Proposed in 1953. If the system changes from state i to state j at temperature T, then j is accepted as the new state when Ej < Ei; otherwise j is accepted with probability exp(−(Ej − Ei)/(kT)). The Metropolis algorithm is the basis of the annealing algorithm, and it is also what allows the search to move from a local optimum toward the global optimum.

During the “annealing” process, the casting cools slowly, molecular motion becomes orderly, and each molecule tends toward an equilibrium state. When the temperature finally drops to room temperature, the system reaches its ground state: the internal energy is reduced to a minimum and the atoms settle into a stable configuration.

At a certain temperature $T_0$, suppose the state changes (the search moves from one trough of the energy landscape to the next). If the energy of the next trough $E(n+1)$ is less than the original $E(n)$, accept it as the new solution; otherwise accept it with a probability $P$:

$$P = \begin{cases} 1 & E(n+1) < E(n) \\ e^{-(E(n+1)-E(n))/T} & E(n+1) \ge E(n) \end{cases}$$

The annealing process is controlled by the cooling schedule (Cooling Schedule), which specifies the initial value t of the control parameter and its decay factor Δt, the number of iterations L at each value of t, and the stopping condition Tf.


Three principles of acceptance:

(1) At a fixed temperature, the probability of accepting a candidate solution that decreases the objective function is greater than the probability of accepting one that increases it;

(2) As the temperature decreases, the probability of accepting a solution that increases the objective function gradually decreases;

(3) When the temperature approaches zero, only solutions in which the objective function decreases can be accepted.
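These three principles can be checked numerically from the Metropolis acceptance probability exp(−ΔE/T); the helper and sample values below are purely illustrative:

```python
import math

def accept_prob(delta_e, T):
    """Metropolis acceptance probability for an energy change delta_e at temperature T."""
    if delta_e < 0:   # principle (1): a decrease in the objective is always accepted
        return 1.0
    if T <= 0:        # principle (3): as T approaches zero, uphill moves die out
        return 0.0
    return math.exp(-delta_e / T)

# principle (2): for the same uphill move, acceptance shrinks as T drops
for T in (100.0, 10.0, 1.0, 0.1):
    print(f"T={T:6.1f}  P(accept ΔE=1) = {accept_prob(1.0, T):.4f}")
```

The printed probabilities fall from about 0.99 at T = 100 toward 0 at T = 0.1.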

It can be seen that the selection and control of parameters constitute the focus and innovation of the simulated annealing algorithm.

SAA parameter selection control

  1. Choosing a sufficiently high initial temperature $T_0$ means more candidate states can be reached, bringing the search closer to the global optimal solution.
  2. Exponential decay is a particularly convenient cooling scheme, although its convergence is slow.
  3. Selection of termination temperature.
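For the exponential scheme T ← αT, the number of cooling steps needed to fall from T0 below the termination temperature Tf follows directly from α. A small helper sketch (the sample parameters happen to match the example program at the end of this article):

```python
import math

def cooling_steps(T0, Tf, alpha):
    """Number of applications of T <- alpha*T needed for T0 to fall below Tf."""
    return math.ceil(math.log(Tf / T0) / math.log(alpha))

print(cooling_steps(1000.0, 1e-3, 0.99))  # 1375 outer-loop iterations
```

A smaller α cools faster but risks freezing into a local minimum; an α closer to 1 converges more slowly but searches more thoroughly.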

Directions for improvement of SAA

Disadvantages of simulated annealing algorithm:

  1. Slow convergence
  2. There is a trade-off between solution quality and running time
  3. It is difficult to determine whether the final equilibrium state has been reached

At present, there are some efficient improvement solutions, such as parallel simulated annealing algorithm, fast simulated annealing algorithm, etc.

Possible directions for improvement:
Adopt a parallel search structure, improve the state-generation function, improve the annealing schedule, and add new stages, such as a heating/reheating phase, a supplementary search phase, or multiple search strategies.
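As a sketch of one such improvement, the reheating idea can be grafted onto the basic loop: after each full cooling run the temperature is reset and the search restarts from the best solution found so far. The structure and parameter values below are an illustrative assumption, not a standard formulation:

```python
import math
import random

def anneal_with_reheat(objective, neighbor, x0, T0=100.0, Tf=1e-3,
                       alpha=0.95, L=50, reheats=2):
    """SA with a simple reheating loop (illustrative sketch)."""
    best_x, best_y = x0, objective(x0)
    for _ in range(reheats + 1):       # initial run plus `reheats` reheated runs
        x, y, T = best_x, best_y, T0   # reheat: reset T, keep the best state
        while T > Tf:
            for _ in range(L):
                x_new = neighbor(x)
                y_new = objective(x_new)
                d = y_new - y
                if d < 0 or random.random() < math.exp(-d / T):
                    x, y = x_new, y_new
                if y < best_y:         # remember the best solution ever seen
                    best_x, best_y = x, y
            T *= alpha
    return best_x, best_y
```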

SAA simple application example

Use the simulated annealing algorithm to find the minimum of $f(x) = x^2$:

import math
import random
  
def simulated_annealing():
    # initialization
    x = 20*(random.random()-0.5) # initial solution, uniform in [-10, 10]
    T = 1000 # initial temperature
    T_min = 1e-3 # Temperature lower limit, when T is less than this value, stop iteration
    alpha = 0.99 # Temperature attenuation coefficient
    y = x**2 # Quality of initial solution
  
    while T > T_min:
        # Randomly select a new solution near the current solution
        delta_x = (random.random()-0.5)
        new_x = x + delta_x
        new_y = new_x**2
  
        # Metropolis criterion: If the new solution is better than the old solution, or accept a bad solution based on probability to avoid falling into a local minimum
        if new_y < y:
            x = new_x
            y = new_y
        else:
            p = math.exp((y - new_y) / T)
            if random.random() < p:
                x = new_x
                y = new_y
  
        # cool down
        T *= alpha
  
    return x, y
  
if __name__ == "__main__":
    x, y = simulated_annealing()
    print(f"Optimal solution: x = {x}, y = {y}")