Random Particle Approach (RPSO)


Methodology

The theoretical models developed in this thesis are used to characterise the behaviour of all the newly introduced algorithms. Each new algorithm is theoretically analysed to show whether it is guaranteed to converge on either a local or global minimum, depending on whether the algorithm is a local or global search algorithm, respectively.
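For reference, the notion of guaranteed convergence used in such analyses can be stated formally. The sketch below uses the standard stochastic-convergence criterion; the symbols (the search space S, the objective f, the optimality region R_epsilon, and the best solution x_t) are introduced here purely for illustration and are not drawn from the excerpt itself.

```latex
% Hypothetical notation, not taken from the excerpt:
%   S      -- the search space
%   f      -- the objective function
%   x^*    -- a global minimiser of f over S
%   x_t    -- the best solution found by the algorithm up to iteration t
% A stochastic algorithm is regarded as globally convergent if the best
% solution enters the optimality region R_epsilon with probability one:
\[
  R_\epsilon = \{\, \mathbf{x} \in S : f(\mathbf{x}) < f(\mathbf{x}^{*}) + \epsilon \,\},
  \qquad
  \lim_{t \to \infty} \Pr\!\left[\mathbf{x}_t \in R_\epsilon\right] = 1 .
\]
% A merely locally convergent algorithm satisfies the same condition with
% x^* replaced by a local minimiser of f.
```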
Empirical results were obtained using various synthetic benchmark functions with well-known characteristics. These results serve as supporting evidence for the theoretical convergence characteristics of the algorithms. Owing to the stochastic nature of these algorithms, it is not always possible to observe directly the behaviour predicted by the theoretical model; for example, a stochastic global optimisation algorithm may require an infinite number of iterations to guarantee that it will locate the global minimiser, so the probability of observing such an algorithm locate the global minimiser within a finite number of iterations may be very small. Despite this limitation, it is still possible to determine whether the algorithm is making progress towards its goal, or whether it has become trapped in a local minimum.
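To make the idea of synthetic benchmark functions with well-known characteristics concrete, the sketch below defines two classic test functions, the unimodal Sphere function and the highly multimodal Rastrigin function, together with a crude progress check of the kind described above. The specific functions, names, and thresholds are illustrative assumptions; the excerpt does not state which benchmarks or criteria were used.

```python
import numpy as np

def sphere(x: np.ndarray) -> float:
    """Unimodal benchmark: a single global minimum at the origin, f(0) = 0."""
    return float(np.sum(x ** 2))

def rastrigin(x: np.ndarray) -> float:
    """Multimodal benchmark: global minimum f(0) = 0 surrounded by many
    local minima, useful for exposing entrapment and premature convergence."""
    return float(10.0 * x.size + np.sum(x ** 2 - 10.0 * np.cos(2.0 * np.pi * x)))

def is_stagnating(best_history, window=50, tol=1e-8):
    """Crude progress check: has the best objective value improved by more
    than `tol` over the last `window` recorded iterations?"""
    if len(best_history) < window:
        return False
    return abs(best_history[-window] - best_history[-1]) < tol
```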
The results of two Genetic Algorithm-based optimisation techniques are also reported for the same synthetic benchmark functions. These results give an indication of the relative performance of the PSO-based techniques compared with other stochastic, population-based algorithms.
A second set of experiments was performed on a real-world problem to act as a control for the results obtained on the synthetic functions. The task of training both summation and product unit neural networks was selected as an example of a real-world optimisation problem. On these problems the results of the PSO-based algorithms were compared with those of the GA-based algorithms, as well as those of two efficient gradient-based algorithms.
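Neural network training can be framed as a black-box optimisation problem by flattening all weights into a single parameter vector and minimising the training error, which lets the same population-based optimisers be applied unchanged. The sketch below shows this framing for a small summation-unit (ordinary weighted-sum) network; the layer sizes, dataset, and function names are illustrative assumptions rather than details from the excerpt.

```python
import numpy as np

def mse_objective(weights: np.ndarray, X: np.ndarray, y: np.ndarray,
                  n_hidden: int = 3) -> float:
    """Mean squared error of a one-hidden-layer summation-unit network,
    with all weights packed into one flat vector so that any
    population-based optimiser (PSO, GA, ...) can minimise it."""
    n_in = X.shape[1]
    # Unpack the flat vector into the two weight matrices (biases omitted
    # for brevity).
    w1 = weights[: n_in * n_hidden].reshape(n_in, n_hidden)
    w2 = weights[n_in * n_hidden:].reshape(n_hidden, 1)
    hidden = np.tanh(X @ w1)          # summation units with tanh activation
    output = hidden @ w2
    return float(np.mean((output.ravel() - y) ** 2))

# Example usage on a toy regression problem (illustrative data only):
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(50, 2))
y = np.sin(X[:, 0]) + 0.5 * X[:, 1]
dim = X.shape[1] * 3 + 3 * 1          # total number of weights
candidate = rng.normal(size=dim)      # one particle / chromosome
print(mse_objective(candidate, X, y))
```

A product-unit network would replace the weighted sums in the hidden layer with products of the inputs raised to the corresponding weight powers; only the objective function changes, while the optimisers themselves remain the same.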


1 Introduction 
1.1 Motivation
1.2 Objectives
1.3 Methodology
1.4 Contributions
1.5 Thesis Outline
2 Background 
2.1 Optimisation
2.1.1 Local Optimisation
2.1.2 Global Optimisation
2.1.3 No Free Lunch Theorem
2.2 Evolutionary Computation
2.2.1 Evolutionary Algorithms
2.2.2 Evolutionary Programming (EP)
2.2.3 Evolution Strategies (ES)
2.3 Genetic Algorithms (GAs)
2.4 Particle Swarm Optimisers
2.4.1 The PSO Algorithm
2.4.2 Social Behaviour
2.4.3 Taxonomic Designation
2.4.4 Origins and Terminology
2.4.5 Gbest Model
2.4.6 Lbest Model
2.5 Modifications to the PSO
2.5.1 The Binary PSO
2.5.3 Increased Diversity Improvements
2.5.4 Global Methods
2.5.5 Dynamic Objective Functions
2.6 Applications
2.7 Analysis of PSO Behaviour
2.8 Coevolution, Cooperation and Symbiosis
2.9 Important Issues Arising in Coevolution
2.10 Related Work
3 PSO Convergence
3.1 Analysis of Particle Trajectories
3.2 Modified Particle Swarm Optimiser (GCPSO)
3.3 Convergence Proof for the PSO Algorithm
3.4 Stochastic Global PSOs
3.4.1 Non-Global PSOs
3.4.2 Random Particle Approach (RPSO)
3.4.3 Multi-start Approach (MPSO)
3.4.4 Rate of Convergence
3.4.5 Stopping Criteria
4 Models for Cooperative PSOs 
4.1 Models for Cooperation
4.2 Cooperative Particle Swarm Optimisers
4.3 Hybrid Cooperative Particle Swarm Optimisers
4.4 Conclusion
5 Empirical Analysis of PSO Characteristics 
5.1 Methodology
5.2 Convergence Speed versus Optimality
5.3 GCPSO Performance
5.4 Global PSO Performance
5.5 Cooperative PSO Performance
5.6 Conclusion
6 Neural Network Training 
7 Conclusion 
A Glossary
B Definition of Symbols
C Derivation of Explicit PSO Equations
D Function Landscapes
