2 Local Algorithms:

      2.1 Simulated annealing :

Simulated annealing is an optimization algorithm inspired by the metal annealing process, where a metal is heated and then cooled gradually to achieve a stable structure. In computer science, it is commonly used to solve complex optimization problems, such as the Traveling Salesman Problem (TSP), with the objective of minimizing a cost function.

    The main idea of this algorithm is to avoid local minima (sub-optimal solutions) by temporarily accepting solutions of lower quality. The likelihood of accepting these solutions decreases over time, allowing a broad initial exploration followed by refinement toward an optimal solution.

2.2 Steps of the simulated annealing algorithm:

1. Initialization:

       Choose an initial solution, denoted S.

       Set the initial temperature T.

       Define the cooling rate alpha (e.g. 0.99), the number of iterations per temperature, and the stopping criterion (for example, when the temperature is near zero or after a defined number of iterations).

2. Main loop :

As long as the temperature T is above the minimum threshold:

1. For a given number of iterations:

- Generate a new solution S' by making a small change to the current solution S.

- Calculate the difference in cost between the new solution S' and the current solution S: ΔE = f(S') − f(S).

- If S' is a better solution (i.e. if ΔE < 0), accept S' as the new current solution.

- Otherwise, accept S' with probability exp(−ΔE / T). This probability decreases as the temperature drops.

2. Reduce temperature as follows: T = alpha * T.

3. End of algorithm:

       When the temperature is very low or the stopping criterion is reached, the current solution S is considered the best solution found.
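The steps above can be sketched as follows in Python. The one-dimensional cost function f(x) = x² and the neighbor move (a small random perturbation) are illustrative assumptions, not part of the original text; for a problem like the TSP, the solution would be a tour and the move a swap of two cities.

```python
import math
import random

def simulated_annealing(f, s, t_initial=100.0, t_min=1e-3,
                        alpha=0.95, iters_per_temp=50):
    """Minimize cost function f starting from solution s (a float here)."""
    t = t_initial
    best = s
    while t > t_min:                                  # stopping criterion: T near zero
        for _ in range(iters_per_temp):
            s_new = s + random.uniform(-1.0, 1.0)     # small change to current solution
            delta_e = f(s_new) - f(s)                 # ΔE = f(S') − f(S)
            # Accept improvements; accept worse moves with probability exp(−ΔE/T)
            if delta_e < 0 or random.random() < math.exp(-delta_e / t):
                s = s_new
                if f(s) < f(best):
                    best = s
        t *= alpha                                    # geometric cooling: T = alpha * T
    return best

random.seed(0)
result = simulated_annealing(lambda x: x * x, s=10.0)
```

Starting far from the minimum at x = 0, the algorithm first wanders widely (high T accepts many uphill moves), then settles near the optimum as T decreases.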

2.3 Parameters to be adjusted:

       Initial temperature T_initial: Must be high enough to allow initial exploration of the solution space.

       Cooling rate alpha: Controls how quickly the temperature decreases, usually between 0.8 and 0.99.

       Number of iterations per temperature: Determines how many neighbors are tested before lowering the temperature.

           2.4 Application and Benefits

Simulated annealing is particularly useful in situations where the solution space is very large and contains many local minima. Although simulated annealing does not always guarantee the optimal solution, it is often able to find a satisfactory solution in a reasonable time for complex problems.

3 Hill Climbing:

The Hill Climbing algorithm is a simple optimization method used to find the maximum or minimum of a cost function by making successive improvements. It is a "greedy" algorithm that progresses toward the solution by choosing at each step the option that seems best immediately. However, it can easily get stuck in a local optimum, with no guarantee that the global optimum is reached.

          3.1  Principle of the Hill Climbing algorithm:

  1. Initialization:

       Choose a random initial solution.

       Calculate the value of the cost function f(S) for this solution.

          2. Search loop:

         As long as there is room for improvement:

        ·        Generate the neighbors of the current solution S.

        ·       Select the best neighbor S' among these, based on the value of the cost function.

        ·       If S' is better than S (i.e. if f(S') < f(S), for minimization), update the current solution: S = S'.

        ·       Otherwise, stop the algorithm (a local maximum or minimum has been reached).
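The loop above can be sketched as follows in Python. The toy problem (minimizing f(x) = x² over the integers, with neighbors x − 1 and x + 1) is an illustrative assumption chosen so the search terminates quickly.

```python
def hill_climbing(f, s, neighbors):
    """Minimize f by repeatedly moving to the best neighbor of s."""
    while True:
        best_neighbor = min(neighbors(s), key=f)   # best S' by cost
        if f(best_neighbor) < f(s):                # strict improvement only
            s = best_neighbor                      # update: S = S'
        else:
            return s                               # local optimum reached

# Toy example: minimize f(x) = x^2 starting from x = 7, step +/-1
result = hill_climbing(lambda x: x * x,
                       s=7,
                       neighbors=lambda x: [x - 1, x + 1])
```

Here the search walks 7 → 6 → … → 0 and stops at x = 0, since both neighbors of 0 have a higher cost.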

3.2 Important parameters:

  1. Initial solution: Hill Climbing performance is highly dependent on the starting solution, as this influences the local optimum achieved.
  2. Cost function: Hill Climbing uses only the value of the cost function to compare solutions.
  3. Neighbors: The choice of method to generate neighbors is crucial. Too few neighbors can limit the search, while too many can increase the computation time.
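To illustrate point 3, a common neighborhood for tour-based problems such as the TSP is the 2-swap move: exchange two cities in the tour. This sketch is an illustrative assumption (the function name `swap_neighbors` does not appear in the original text); note how the neighborhood size grows quadratically with the number of cities, which affects computation time.

```python
def swap_neighbors(tour):
    """Generate all neighbors of a tour by swapping two cities (2-swap)."""
    n = len(tour)
    result = []
    for i in range(n):
        for j in range(i + 1, n):
            neighbor = list(tour)                            # copy the tour
            neighbor[i], neighbor[j] = neighbor[j], neighbor[i]
            result.append(neighbor)
    return result

# A tour of 5 cities has C(5, 2) = 10 swap neighbors
neighbors_of_5 = swap_neighbors([0, 1, 2, 3, 4])
```

A larger neighborhood (e.g. all 2-swaps) explores more options per step but costs more per iteration; a smaller one (e.g. only adjacent swaps) is faster but limits the search, as noted above.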

3.3 Benefits and Limitations:

           Benefits:

    1. Simple to implement.
    2. Fast for simple problems where local optima are rare.

           Limitations :

    1. Prone to getting stuck in local optima, with no guarantee of finding the global optimum.

    Example of use:

    Hill Climbing is useful for problems where fast computation and local optimization are the priority, such as parameter adjustment, simple path searches, or system configurations. However, for complex problems (e.g. TSP), more advanced algorithms such as simulated annealing or genetic algorithms may be more effective.

Last modified: Sunday, 10 November 2024, 21:28