
## Heuristics

Optimization techniques can be classified into heuristic, exact, and approximation methods. Heuristic methods try to find optimal or near-optimal solutions in a significantly reduced amount of time. They are categorized into constructive methods and local search methods. Constructive algorithms build solutions from scratch by adding solution components to an initially empty partial solution until a complete solution is reached.
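As an illustration, a constructive method can be sketched as a generic greedy loop. The knapsack-style instance, the `feasible` predicate, and the value-density `score` below are invented for the example and are not taken from this thesis:

```python
def greedy_construct(items, feasible, score):
    """Generic constructive heuristic: start from an empty partial
    solution and repeatedly add the best-scoring component that keeps
    the partial solution feasible; stop when nothing more fits."""
    solution = []
    candidates = list(items)
    while candidates:
        candidates.sort(key=score, reverse=True)
        added = False
        for c in candidates:
            if feasible(solution + [c]):
                solution.append(c)
                candidates.remove(c)
                added = True
                break
        if not added:
            break  # no remaining component keeps the solution feasible
    return solution

# Toy knapsack: items are (value, weight) pairs, capacity 10,
# components ranked by value density.
items = [(10, 5), (6, 4), (7, 6), (3, 2)]
feasible = lambda sol: sum(w for _, w in sol) <= 10
score = lambda item: item[0] / item[1]
print(greedy_construct(items, feasible, score))  # [(10, 5), (6, 4)]
```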

Local search algorithms start from an initial solution and iteratively replace the current solution by a better candidate from its neighborhood [22]. As defined in [18], a heuristic technique is a method that tries to find good solutions at a reasonable computational cost without being able to guarantee optimality. Unfortunately, it may not even be possible to determine how close to the optimal solution a particular heuristic solution is.
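A minimal sketch of the local search scheme just described; the neighborhood function and objective in the toy call are invented for illustration:

```python
def local_search(initial, neighbors, cost, max_iters=1000):
    """Best-improvement local search: repeatedly move to the cheapest
    neighbor of the current solution, stopping at a local optimum.

    initial   -- starting solution
    neighbors -- function returning the neighborhood of a solution
    cost      -- objective to minimize
    """
    current = initial
    for _ in range(max_iters):
        best = min(neighbors(current), key=cost, default=None)
        if best is None or cost(best) >= cost(current):
            break  # no improving neighbor: local optimum reached
        current = best
    return current

# Toy example: minimize |x - 7| over the integers,
# with neighbors x - 1 and x + 1.
result = local_search(0, lambda x: [x - 1, x + 1], lambda x: abs(x - 7))
print(result)  # 7
```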

### Cutting plane methods

Many combinatorial optimization problems can be formulated as mixed integer linear programs. They can then be solved by branch-and-cut methods, which are exact algorithms combining a branch-and-bound algorithm with a cutting plane method. These methods work by solving a sequence of linear programming relaxations of the integer programming problem. Cutting plane methods tighten the relaxation so that it approximates the integer program more closely, while branch-and-bound explores the solution space with a sophisticated divide-and-conquer strategy. Cutting plane algorithms for general integer programming problems were first proposed by Gomory [27], who proved that they terminate after a finite number of iterations with an optimum solution; for this reason the method is sometimes called the « Gomory cut » method.
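The cutting-plane scheme can be sketched as a loop around a relaxation solver and a separation oracle. The `solve_relaxation` and `separate` functions below are hypothetical stand-ins for a real LP solver and cut generator, and the one-variable instance is a toy chosen so the cut is obvious:

```python
import math

def cutting_plane(solve_relaxation, separate, max_rounds=50):
    """Generic cutting-plane loop.

    solve_relaxation(cuts) -- optimal solution of the relaxation
                              augmented with the cuts found so far
    separate(x)            -- a violated valid inequality for x,
                              or None if x is integer-feasible
    """
    cuts = []
    for _ in range(max_rounds):
        x = solve_relaxation(cuts)
        cut = separate(x)
        if cut is None:
            return x  # relaxation optimum is integer-feasible: done
        cuts.append(cut)
    return x

# Toy instance: maximize x subject to 2x <= 7, x integer.
# The relaxation optimum is x = 3.5; the cut x <= 3 closes the gap.
def solve_relaxation(cuts):
    ub = 7 / 2
    for bound in cuts:          # here each cut is an upper bound on x
        ub = min(ub, bound)
    return ub

def separate(x):
    return math.floor(x) if x != int(x) else None

print(cutting_plane(solve_relaxation, separate))  # optimum x = 3
```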

Now, let $P$ be a combinatorial optimization problem, $E$ its ground set, $w(\cdot)$ the weight function, and $\mathcal{S}$ the set of feasible solutions. The problem $P$ consists in finding an element of $\mathcal{S}$ whose weight is maximum (or minimum). If $F \subseteq E$, the 0-1 vector $x^F \in \mathbb{R}^E$ such that $x^F(e) = 1$ if $e \in F$ and $x^F(e) = 0$ otherwise is called the incidence vector of $F$. The polyhedron $P(\mathcal{S}) = \mathrm{conv}\{x^S \mid S \in \mathcal{S}\}$ is the polyhedron of the solutions of $P$, or the polyhedron associated with $P$. $P$ is thus equivalent to the linear program $\max\{cx \mid x \in P(\mathcal{S})\}$. Notice that the polyhedron $P(\mathcal{S})$ can be described by a set of facet-defining inequalities, and when all the inequalities of this set are known, solving $P$ is equivalent to solving a linear program. The objective of the polyhedral approach for combinatorial optimization is to reduce the resolution of $P$ to that of a linear program; this requires a deep investigation of the polyhedron associated with $P$. It is generally not easy to characterize the polyhedron of a combinatorial optimization problem by a system of linear inequalities; in particular, when the problem is NP-hard, it is difficult to find such a characterization. Moreover, the number of inequalities describing this polyhedron is exponential most of the time. Therefore, even when the complete description of the polyhedron is known, solving the problem remains in practice a hard task because of the large number of inequalities.

#### Modeling the problem using Genetic Algorithm

In this section, we present the modeling of our GAs for directed acyclic graphs (DAGs) in a cloud environment. These scheduling algorithms effectively address the issue of minimizing the makespan.

**Task Scheduling Genetic Algorithm (GATS)**

This GA was proposed for heterogeneous computing systems by Zhou et al. in [50], who call it the task scheduling based Genetic Algorithm (GATS). It is modeled as follows: a linear order of all jobs forms the chromosome. Each chromosome represents a solution to the problem by scheduling the jobs in the order given by the permutation; this order must be a valid topological order of the associated nodes in the DAG, so start nodes are placed at the beginning of the chromosome and last nodes at the end. The initial population is produced by applying random perturbations to the order of jobs in the first chromosome, each yielding a valid chromosome, until the desired size of the initial population is reached. A linear crossover from a single random position is applied to the two selected parents, and the mutation operation is applied to all individuals of the new population while respecting the precedence constraints topologically. The objective function is then evaluated using the Earliest Completion Time (ECT) technique, which schedules a candidate job onto the machine (processor) on which its completion time is earliest. The robust characteristic of this GA is the generation of only valid chromosomes in the initial population. We modify GATS into GATS+ by making, at each generation, a random mutation on two randomly selected genes; if the candidate chromosome is not valid, we discard it by assigning a large value as its Cmax. Since the initial population contains only valid chromosomes, the robust characteristic of GATS is maintained, and little computation time is spent in the mutation operator. This small change increases the chance of finding a better result, especially when the computation time is under one minute, because GATS spends a lot of time in the mutation procedure when a candidate is not valid.
Table 2.3 shows the results obtained by GATS and GATS+ within one second with Psize = 100, Pc = 1.0 and Pm = 0.5. Dashed entries mean that GATS does not find a solution within one second; even when the instances are run for 10 seconds, GATS cannot find a solution for the large instances in any of the three density sets.
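The chromosome encoding and the GATS+-style mutation described above can be sketched as follows. This is illustrative code, not the authors' implementation; the function names and the small DAG are ours:

```python
import random

def random_topological_order(n, preds):
    """Generate a valid chromosome: a random topological order of the
    n jobs, where preds[j] is the set of predecessors of job j."""
    remaining = {j: set(preds.get(j, ())) for j in range(n)}
    order = []
    while remaining:
        ready = [j for j, p in remaining.items() if not p]
        j = random.choice(ready)       # random choice diversifies the population
        order.append(j)
        del remaining[j]
        for p in remaining.values():
            p.discard(j)
    return order

def is_valid(order, preds):
    """Check that a chromosome respects all precedence constraints."""
    pos = {j: i for i, j in enumerate(order)}
    return all(pos[p] < pos[j] for j in order for p in preds.get(j, ()))

def mutate_plus(order, preds, big=10**9):
    """GATS+-style mutation: swap two random genes; an invalid
    offspring is penalized with a huge makespan instead of repaired."""
    child = order[:]
    i, j = random.sample(range(len(child)), 2)
    child[i], child[j] = child[j], child[i]
    penalty = 0 if is_valid(child, preds) else big
    return child, penalty

# DAG: 0 -> 1, 0 -> 2, 1 -> 3, 2 -> 3
preds = {1: {0}, 2: {0}, 3: {1, 2}}
chrom = random_topological_order(4, preds)
assert is_valid(chrom, preds)
```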

**Genetic Algorithm Based on Cut-point (GACP)**

For this genetic algorithm (GACP), the chromosome coding is composed of two rows: the first represents a valid order of jobs according to the precedence constraints, and the second row gives information on job positioning according to the cut-points. We generate $m-1$ random cut-points $cp = \{cp_1, cp_2, \ldots, cp_{m-1}\}$ to assign jobs to the VMs.

The solution is decoded as follows: the sequence of jobs from $j_0$ to $j_{cp_1}$ is assigned to VM1, the sequence of jobs from $j_{cp_1+1}$ to $j_{cp_2}$ to VM2, the sequence from $j_{cp_2+1}$ to $j_{cp_3}$ to VM3, and so on. In other words, we assign a valid sub-sequence of jobs of random length to a specific VM. In this genetic algorithm we apply a one-point crossover between two parents, and an exchange between two random positions as the mutation operator. However, this genetic algorithm gave poor results: the best Cmax obtained by GACP is at least twice the Cmax obtained by GATS.
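The cut-point decoding described above can be sketched as follows, assuming cut-points are positions in the job order (an indexing convention chosen for this example):

```python
def decode_cutpoints(order, cutpoints):
    """Decode a GACP chromosome: split the job order at the m - 1
    cut-points so that each contiguous sub-sequence of jobs is
    assigned to one VM (sub-sequence k goes to VM k+1)."""
    bounds = [0] + sorted(cutpoints) + [len(order)]
    return [order[bounds[k]:bounds[k + 1]] for k in range(len(bounds) - 1)]

# 6 jobs on m = 3 VMs with cut-points {2, 4}:
# jobs 0-1 on VM1, jobs 2-3 on VM2, jobs 4-5 on VM3.
print(decode_cutpoints([0, 1, 2, 3, 4, 5], [2, 4]))  # [[0, 1], [2, 3], [4, 5]]
```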

**Table of contents :**

**Chapter 1 State-of-the-Art**

1.1 Cloud Computing

1.2 Scheduling problems

1.3 Computational complexity

1.4 Heuristics and meta-heuristics

1.4.1 Heuristics

1.4.2 Metaheuristics

Genetic algorithm

1.5 Graph theory

1.6 Optimization problems

1.6.1 Combinatorial optimization

1.6.2 Linear programming

1.6.3 Integer programming

1.7 Polyhedral approach

1.7.1 Elements of polyhedral theory

1.7.2 Cutting plane methods

1.8 Branch and cut algorithm

**Chapter 2 Heuristics and Meta-heuristics Solutions**

2.1 Introduction

2.1.1 Literature Review

2.2 Problem formulation

2.3 An existing algorithm

2.4 Genetic algorithm (GA)

2.4.1 Modeling the problem using Genetic Algorithm

Task Scheduling Genetic Algorithm (GATS)

Genetic Algorithm Based on Cut-point (GACP)

Genetic Algorithm Based on The List of Available Jobs (GAAV)

Genetic Algorithm (GAAV+)

Experimental Results

Integer Linear Programming Solution (ILP)

Transformations Between Genetic Algorithms

Conclusion

**Chapter 3 Mathematical Formulations**

3.1 Introduction

3.2 Problem Description

3.3 Mathematical Formulations

3.3.1 Classical Formulation

3.3.2 Flow Formulation

3.3.3 Order Formulation

3.3.4 Interval Graph Formulation

3.4 Valid Inequalities

3.4.1 Separation Algorithm for SPT Inequality

3.4.2 Reformulation of Interval Graph Formulation

3.5 Experimental Results

3.5.1 Conclusion

**Chapter 4 Polyhedral study on interval graphs under m-clique free constraints**

4.1 Introduction

4.2 The polytopes of interval sub-graphs

4.2.1 Forbidden subgraphs inequalities

Bipartite Claw

Umbrella Inequalities

n-net Inequalities

n-tent Inequalities

Hole inequalities

Clique inequalities

Clique-Hole inequalities

4.3 Cutting plane algorithms

4.3.1 Bipartite claw separation

Exact Separation (ExBC-Sep)

Heuristic 1: Separation (H1BC-Sep)

Heuristic 2: Separation (H2BC-Sep)

4.3.2 Umbrella separation

Exact separation algorithm

H1U-Sep separation

H2U-Sep Separation

n-net separation

n-tent separation

4.3.3 Hole separation

4.3.4 Clique separation

4.3.5 Lazy constraint approach

4.4 Application to URPMDC problem

4.4.1 Mathematical formulation

4.4.2 Computational Results

4.5 Conclusion

**Chapter 5 Generalized Open Shop, and Open Shop Problems**

5.1 Introduction

5.2 Generalized open shop problem with jobs disjunctive constraints

5.2.1 Integer linear programming formulation

5.2.2 Valid inequalities

Sequence inequalities

Previous job inequalities

Line job inequalities

Logical implication inequalities

5.2.3 Experimental results

5.3 Open shop problem

5.3.1 Integer linear programming formulation

5.3.2 Valid inequalities

Sequence inequalities

Previous operations inequalities

Logical implication inequalities

5.3.3 Experimental results

5.4 Conclusion

**Chapter 6 General Conclusion and Perspectives**

**Bibliography**