A PSO-based Clustering Algorithm with Application to Unsupervised Image Classification

Optimization and Optimization Methods

This chapter provides a brief overview of optimization, followed by a brief discussion of traditional and stochastic optimization methods. Evolutionary algorithms (with emphasis on genetic algorithms) are then presented, followed by an elaborated discussion of particle swarm optimization and its various modifications. The chapter concludes with a brief overview of ant colony systems.

Optimization

The objective of optimization is to seek values for a set of parameters that maximize or minimize objective functions subject to certain constraints [Rardin 1998; Van den Bergh 2002]. A choice of values for the set of parameters that satisfies all constraints is called a feasible solution. Feasible solutions with objective function value(s) as good as the values of any other feasible solutions are called optimal solutions [Rardin 1998]. An example of an optimization problem is the arrangement of the transistors in a computer chip such that the resulting layout occupies the smallest possible area and uses as few components as possible. Optimization techniques are used on a daily basis for industrial planning, resource allocation, scheduling, decision making, etc. Furthermore, optimization techniques are widely used in many fields such as business, industry, engineering and computer science. Research in the optimization field is very active and new optimization methods are being developed regularly [Chinneck 2000].
Optimization encompasses both maximization and minimization problems. Any maximization problem can be converted into a minimization problem by taking the negative of the objective function, and vice versa. Hence, the terms optimization, maximization and minimization are used interchangeably in this thesis. In general, the problems tackled in this thesis are minimization problems. Therefore, the remainder of the discussion focuses on minimization problems.
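As a minimal illustration of this equivalence (the quadratic below is a hypothetical objective, chosen only to make the point concrete), maximizing f over a grid of candidate points selects exactly the same point as minimizing g = −f:

```python
import numpy as np

# Hypothetical objective: maximize f(x) = -(x - 3)^2 + 5.
# Maximizing f is equivalent to minimizing g(x) = -f(x).
def f(x):
    return -(x - 3.0) ** 2 + 5.0

def g(x):
    return -f(x)

xs = np.linspace(-10.0, 10.0, 10001)
x_max = xs[np.argmax(f(xs))]     # maximizer of f
x_min = xs[np.argmin(g(xs))]     # minimizer of g = -f
assert np.isclose(x_max, x_min)  # same point, as expected
```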
The minimization problem can be defined as follows [Pardalos et al. 2002]
Given f : S → ℜ, where S ⊆ ℜ^Nd and Nd is the dimension of the search space S, find x∗ ∈ S such that

f(x∗) ≤ f(x), ∀x ∈ S    (2.1)
The variable x∗ is called the global minimizer (or simply the minimizer) of f, and f(x∗) is called the global minimum (or simply the minimum) value of f. This is illustrated in Figure 2.1, where x∗ is a global minimizer of f. The process of finding the global optimal solution is known as global optimization [Gray et al. 1997].
A true global optimization algorithm will find x∗ regardless of the selected starting point x₀ ∈ S [Van den Bergh 2002]. Global optimization problems are generally very difficult and are categorized under the class of nonlinear programming (NLP) [Gray et al. 1997].
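This difficulty can be sketched with a small example. With a hypothetical multimodal function (the function, step size and seed below are illustrative assumptions, not taken from the cited sources), a purely local method such as gradient descent converges to whichever minimizer basin its starting point lies in, whereas even a naive global strategy, multi-start from random points in S, recovers x∗ regardless of any single start:

```python
import numpy as np

# Hypothetical objective with one global and one local minimum.
def f(x):
    return x**4 - 3.0 * x**2 + x

def grad_f(x):
    return 4.0 * x**3 - 6.0 * x + 1.0

def gradient_descent(x0, lr=0.01, steps=2000):
    """Crude local search: ends in whichever basin x0 lies in."""
    x = x0
    for _ in range(steps):
        x -= lr * grad_f(x)
    return x

# A local method is sensitive to the starting point ...
left = gradient_descent(-2.0)   # lands near the global minimizer (~ -1.30)
right = gradient_descent(+2.0)  # lands near the local minimizer (~ +1.13)
print(left, f(left), right, f(right))

# ... whereas multi-start from random points in S = [-3, 3]
# approximates x* regardless of any single starting point.
rng = np.random.default_rng(0)
starts = rng.uniform(-3.0, 3.0, size=50)
x_star = min((gradient_descent(x0) for x0 in starts), key=f)
print(x_star, f(x_star))
```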
Examples of global optimization problems are [Gray et al. 1997]:
• Combinatorial problems: where a linear or nonlinear function is defined over a finite but very large set of solutions, for example, network problems and scheduling [Pardalos et al. 2002]. The problems addressed in this thesis belong to this category.
• General unconstrained problems: where a nonlinear function is defined over an unconstrained set of real values.
• General constrained problems: where a nonlinear function is defined over a constrained set of real values.
Evolutionary algorithms (discussed in Sections 2.4-2.5) have been successfully applied to the above problems to find approximate solutions [Gray et al. 1997]. More details about global optimization can be found in Pardalos et al. [2002], Floudas and Pardalos [1992] and Horst et al. [2000].
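As a foretaste of the stochastic methods elaborated later in this chapter, the sketch below implements the standard gbest particle swarm velocity and position updates (inertia weight w, acceleration coefficients c1 and c2) on the well-known sphere function. The parameter values and test function are illustrative choices only; the formulation actually used in this thesis is presented in Section 2.6.

```python
import numpy as np

# A minimal gbest PSO sketch using the standard velocity/position update:
#   v = w*v + c1*r1*(pbest - x) + c2*r2*(gbest - x);  x = x + v
# No velocity clamping or boundary handling is applied in this sketch.
def pso_minimize(f, dim, n_particles=30, iters=200,
                 w=0.72, c1=1.49, c2=1.49, bounds=(-5.0, 5.0)):
    rng = np.random.default_rng(42)
    lo, hi = bounds
    x = rng.uniform(lo, hi, (n_particles, dim))   # positions
    v = np.zeros((n_particles, dim))              # velocities
    pbest = x.copy()                              # personal best positions
    pbest_val = np.apply_along_axis(f, 1, x)
    g = pbest[np.argmin(pbest_val)].copy()        # global best position

    for _ in range(iters):
        r1 = rng.random((n_particles, dim))
        r2 = rng.random((n_particles, dim))
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
        x = x + v
        vals = np.apply_along_axis(f, 1, x)
        improved = vals < pbest_val               # update personal bests
        pbest[improved] = x[improved]
        pbest_val[improved] = vals[improved]
        g = pbest[np.argmin(pbest_val)].copy()    # update global best
    return g, f(g)

# Example: the (unconstrained) sphere function, minimized at the origin.
x_star, f_star = pso_minimize(lambda x: np.sum(x**2), dim=5)
print(x_star, f_star)
```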

Chapter 1 Introduction
1.1 Motivation
1.2 Objectives
1.3 Methodology
1.4 Contributions
1.5 Thesis Outline
Chapter 2 Optimization and Optimization Methods
2.1 Optimization
2.2 Traditional Optimization Algorithms
2.3 Stochastic Algorithms
2.4 Evolutionary Algorithms
2.5 Genetic Algorithms
2.5.1 Solution Representation
2.5.2 Fitness Function
2.5.3 Selection
2.5.4 Crossover
2.5.5 Mutation
2.5.6 The Premature Convergence Problem
2.6 Particle Swarm Optimization
2.6.1 The PSO Algorithm
2.6.2 The lbest Model
2.6.3 PSO Neighborhood Topologies
2.6.4 The Binary PSO
2.6.5 PSO vs. GA
2.6.6 PSO and Constrained Optimization
2.6.7 Drawbacks of PSO
2.6.8 Improvements to PSO
2.7 Ant Systems
2.8 Conclusions
Chapter 3 Problem Definition
3.1 The Clustering Problem
3.1.1 Definitions
3.1.2 Similarity Measures
3.1.3 Clustering Technique
3.1.4 Clustering Validation Technique
3.1.5 Determining the Number of Clusters
3.1.6 Clustering using Self-Organizing Maps
3.1.7 Clustering using Stochastic Algorithms
3.1.8 Unsupervised Image Classification
3.2 Image Segmentation using Clustering
3.3 Color Image Quantization
3.4 Spectral Unmixing
3.5 Conclusions
Chapter 4 A PSO-based Clustering Algorithm with Application to Unsupervised Image Classification
4.1 PSO-Based Clustering Algorithm
4.1.1 Measure of Quality
4.1.2 PSO-Based Clustering Algorithm
4.1.3 A Fast Implementation
4.2 Experimental Results
4.2.1 gbest PSO versus K-Means
4.2.2 Improved Fitness Function
4.2.3 gbest PSO versus GCPSO
4.2.4 Influence of PSO Parameters
4.2.5 gbest PSO versus State-of-the-Art Clustering Algorithms
4.2.6 Different Versions of PSO
4.2.7 A Non-parametric Fitness Function
4.2.8 Multispectral Imagery Data
4.2.9 PSO for Data Clustering
4.3 Conclusions
Chapter 5 SIGT: Synthetic Image Generation Tool for Clustering Algorithms
Chapter 6 Dynamic Clustering using Particle Swarm Optimization with Application to Unsupervised Image Classification
Chapter 7 Applications
Chapter 8 Conclusion 
Bibliography
