
Background, Definitions and Related works

In this chapter, we provide the reader with background material on the domain of this thesis work. A description of the methods, algorithms and theoretical concepts involved in this domain will give the reader a better understanding of the current work.

Optical flow and Motion

Optical flow is the pattern of apparent 2D motion of objects in sequences of time-ordered images. Two-dimensional image motion in the image plane is the projection of three-dimensional object motion, R^3 -> R^2, onto that plane. Since each image consists of many pixels, each with unique coordinates, this motion can be described as a 2D vector at every pixel of a video frame.
In time-ordered images, such as video frames, these vectors show the motion of each pixel from image 1 to image 2, or from video frame 1 to video frame 2.
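To make this representation concrete, the sketch below is a small NumPy illustration (the array names and the helper warp_with_flow are hypothetical and not part of this thesis); it stores a dense flow field as one horizontal and one vertical displacement per pixel and uses it to warp frame 1 towards frame 2:

import numpy as np

def warp_with_flow(frame1, u, v):
    """Warp frame1 by the dense flow field (u, v) with nearest-neighbour lookup.
    frame1 is a 2D intensity array; u and v hold one horizontal and one vertical
    displacement per pixel."""
    h, w = frame1.shape
    ys, xs = np.mgrid[0:h, 0:w]
    # A pixel at (x, y) in frame 1 is predicted to appear at (x + u, y + v) in
    # frame 2, so frame 2 at (x, y) is approximated by frame 1 at (x - u, y - v).
    src_x = np.clip(np.rint(xs - u).astype(int), 0, w - 1)
    src_y = np.clip(np.rint(ys - v).astype(int), 0, h - 1)
    return frame1[src_y, src_x]

# A dense flow field: one 2D vector per pixel of a 64x64 frame.
u = np.full((64, 64), 1.0)   # every pixel moves one pixel to the right
v = np.zeros((64, 64))       # no vertical motion
frame1 = np.random.rand(64, 64)
frame2_predicted = warp_with_flow(frame1, u, v)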
As mentioned above, image points (pixels) travel from one frame to another, and this apparent travel is the optical flow. By definition, optical flow is the apparent motion of the brightness pattern. At this point it is important to note that the motion field is not equal to the optical flow. Consider a uniform sphere with some shading on its surface: if the sphere rotates, the shading pattern does not move at all, so the motion field is non-zero but the optical flow is zero, because the processed frames are identical.
Conversely, if the shading pattern changes because the light source moves while the sphere stays fixed, then the motion field is zero but the optical flow is not.
These examples show that "optical flow" and "motion field" are not the same thing, although for most natural scenes one can assume they are highly correlated.
Figure 5: The barber pole illusion
The barber pole illusion is shown in figure 5. It is a visual illusion that reveals biases in the estimation of visual motion in the observer's brain. When the pole rotates to the right or left as shown in figure 5 (with its top and bottom hidden), the observer perceives the stripes as moving up or down, i.e. along the pole's vertical axis.
This illusion happens because the contour provides ambiguous information about its true direction of movement. This situation is referred to as the aperture problem, which is discussed in detail in the next section.

Optical flow calculation and Aperture problem

We assume an image F(x, y, t) representing the spatio-temporal intensity of moving particles, and the velocity of an image pixel s = (x, y) moving in the image plane:

v = (v_x, v_y) = (dx/dt, dy/dt)   (2.1)

where s = (x, y) is the coordinate of a particle and t is the time coordinate.
If we assume that the intensity at s stays the same during dt, we reach equations [1.1] and [1.2] respectively, and then:

∇F · v + F_t = 0   (2.2)

where ∇F = (F_x, F_y) is the spatial intensity gradient of the image at pixel s. This equation is called the optical flow constraint equation, and it defines a single local constraint on image motion.
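For completeness, the standard textbook derivation of (2.2) from the brightness constancy assumption can be written out (in LaTeX notation; this is the usual argument, reproduced here as a reminder rather than taken from the thesis itself):

F(x + v_x\,dt,\; y + v_y\,dt,\; t + dt) = F(x, y, t)                  % brightness constancy during dt
F(x, y, t) + F_x v_x\,dt + F_y v_y\,dt + F_t\,dt \approx F(x, y, t)   % first-order Taylor expansion
F_x v_x + F_y v_y + F_t = \nabla F \cdot v + F_t = 0                  % which is equation (2.2)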
But we need to find two unknowns, v_x and v_y, from only one constraint equation, which means we cannot determine the optical flow from this single observation.
In figure 6, any point on the constraint line can be the optical flow of the given image pixel. The normal velocity is defined as the vector perpendicular to the constraint line; it is the smallest-magnitude velocity on the optical flow constraint line, and it is the only velocity component that can be estimated, namely the one in the direction of the local gradient of the image intensity function.
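To make this concrete, a minimal NumPy sketch (the function name normal_velocity and the per-pixel gradient values Fx, Fy, Ft are illustrative assumptions, not part of the thesis implementation) computes this normal velocity directly from equation (2.2):

import numpy as np

def normal_velocity(Fx, Fy, Ft, eps=1e-9):
    """Normal flow at one pixel: the velocity component along the spatial gradient.
    From (2.2), grad(F) . v + Ft = 0, so the velocity in the gradient direction has
    magnitude -Ft / |grad(F)| and direction grad(F) / |grad(F)|."""
    g = np.array([Fx, Fy])
    g2 = g @ g
    if g2 < eps:               # no spatial structure: normal flow is undefined
        return np.zeros(2)
    return -Ft * g / g2        # equals (-Ft / |g|) * (g / |g|)

# Example: intensity increases to the right (Fx = 1) and brightens over time (Ft = 0.5),
# so the constraint forces a normal velocity of half a pixel to the left.
print(normal_velocity(1.0, 0.0, 0.5))   # -> [-0.5  0. ]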
As discussed in Chapter 1, we cannot determine those flows from the gradient alone.
In aperture 3, based on the explanation above, only a horizontal motion (the normal motion) orthogonal to the square's border can be estimated, whereas in aperture 2, which shows the corner point, there is enough information (both the vertical and horizontal edges of the square) to estimate both components of the motion.

Optical flow methods

There are several different techniques for solving this problem, and all of them introduce some additional condition or constraint in order to estimate the actual flow.
The classification of these techniques based on Beauchemin and Barron's studies [11] is as follows:
1. Intensity based differential methods
2. Frequency based methods
3. Correlation based methods
4. Multiple motion methods
5. Multiconstraint methods
6. Temporal refinement methods
Methods that use equation (1.1) are referred to as differential techniques.
In differential techniques, image velocity is computed from the spatio-temporal derivatives of the image intensity. The image sequence is therefore treated as a continuous (differentiable) function in time and space. Equation (2.2) is the basic formula for computing optical flow in global and local first- and second-order methods. In addition to equation (2.2), global methods use further global constraints, as well as a smoothness regularization term, to estimate dense optical flow over large image regions. Local methods use the normal velocity information in a local neighborhood and perform a minimization to find the best fit for v.
A contour or surface model can also be used to integrate the available normal velocities into full velocities, and discontinuous optical flow can be analyzed with parametric models, line processes or mixed velocity distributions. The techniques mentioned above segment the optical flow into regions corresponding to different, independently moving surfaces or objects.
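To make the notion of spatio-temporal derivatives concrete, a minimal sketch (plain finite differences in NumPy; real implementations usually pre-smooth the frames and use better derivative filters) is:

import numpy as np

def spatiotemporal_gradients(frame1, frame2):
    """Estimate Fx, Fy, Ft from two consecutive grayscale frames (2D float arrays)."""
    Fx = np.gradient(frame1, axis=1)   # spatial derivative along x (columns)
    Fy = np.gradient(frame1, axis=0)   # spatial derivative along y (rows)
    Ft = frame2 - frame1               # temporal derivative as a forward difference
    return Fx, Fy, Ft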
In this thesis we have chosen differential methods for estimating optical flow, but differential methods themselves have several subsets:
Global methods
Local models
Surface models
Contour models
Multiconstraint methods
Two well‐known differential methods are the Lucas‐Kanade and Horn‐Schunck techniques.
Local techniques, e.g. the Lucas and Kanade method [12], involve the optimization of a local energy functional; frequency-based minimization methods can be found in [13] and [14].
The global category, as in Horn and Schunck [15], refers to methods that determine optical flow by minimizing a global energy functional.
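As an illustration of how such a global method proceeds, the classical Horn-Schunck iteration can be sketched as follows (a minimal NumPy/SciPy version; the smoothness weight alpha, the iteration count and the averaging kernel are the usual textbook choices, not parameters taken from this thesis):

import numpy as np
from scipy.ndimage import convolve

def horn_schunck(Fx, Fy, Ft, alpha=1.0, n_iter=100):
    """Classical Horn-Schunck iteration, given precomputed gradients Fx, Fy, Ft."""
    u = np.zeros_like(Fx)
    v = np.zeros_like(Fx)
    # Kernel that averages the neighbours of each pixel (the standard HS weighting).
    kernel = np.array([[1/12, 1/6, 1/12],
                       [1/6,  0.0, 1/6],
                       [1/12, 1/6, 1/12]])
    for _ in range(n_iter):
        u_avg = convolve(u, kernel)
        v_avg = convolve(v, kernel)
        # Shared term from minimising the global energy (data term + alpha^2 * smoothness).
        num = Fx * u_avg + Fy * v_avg + Ft
        den = alpha ** 2 + Fx ** 2 + Fy ** 2
        u = u_avg - Fx * num / den
        v = v_avg - Fy * num / den
    return u, v

The smoothness term is what propagates flow into structureless regions and yields a dense field, which is exactly the strength of global methods noted below.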
Differential techniques are used widely because of their high level of performance [16].
Global and local methods each have their own advantages and disadvantages. Global techniques provide dense flow fields and can handle regions with "no structure", but as a disadvantage they have a much larger sensitivity to noise [16]; local methods, on the other hand, offer robustness to noise.
There is also an approach by N. Bauer, P. Pathirana and P. Hodgson that develops neighborhood selection for a combined Horn-Schunck / Lucas-Kanade robust optical flow [17].
We now explain these two well-known differential techniques and then clarify the techniques chosen for this thesis.


Local model and Lucas and Kanade method

One of the classical approaches for estimation of the optical flow was suggested by Lucas and Kanade (LK). Originally, they did not formulate an analysis of spatio‐temporal volumes, but rather considered template matching/registration between two images [12].
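As an illustration of registration by template matching (a brute-force SSD search with hypothetical names; this is purely illustrative and not the formulation Lucas and Kanade actually optimize, which the following paragraphs describe), such a search might look like:

import numpy as np

def ssd_displacement(frame1, frame2, y, x, patch=7, search=5):
    """Return the integer displacement (dy, dx) of the patch centred at (y, x) in
    frame1 that best matches frame2, by minimising the sum of squared differences.
    Assumes (y, x) lies far enough from the image border."""
    r = patch // 2
    template = frame1[y - r:y + r + 1, x - r:x + r + 1]
    best_ssd, best_d = np.inf, (0, 0)
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            candidate = frame2[y + dy - r:y + dy + r + 1, x + dx - r:x + dx + r + 1]
            if candidate.shape != template.shape:   # skip shifts that leave the image
                continue
            ssd = np.sum((template - candidate) ** 2)
            if ssd < best_ssd:
                best_ssd, best_d = ssd, (dy, dx)
    return best_d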
In this model we assume that the optical flow is constant within a small window Ω, where Ω is a spatial neighborhood, and we define a window function w(s) > 0 for s ∈ Ω.
The solution is the optical flow at the image pixel s, and we should consider that the reliability of the estimate of v is revealed by the eigenvalues of the matrix S. Let λ1 and λ2 be these eigenvalues: if both are large, the flow can be determined uniquely; if one of them is zero but the other is large, we have a linearly symmetric case and only the motion of lines can be determined; if λ1 = λ2 = 0, no motion can be inferred.
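The corresponding computation can be sketched for a single pixel as follows (a minimal NumPy version; the window arrays, the weighting and the eigenvalue threshold tau are illustrative assumptions rather than the exact implementation of this thesis):

import numpy as np

def lucas_kanade_pixel(Fx_win, Fy_win, Ft_win, w_win, tau=1e-2):
    """Estimate v = (vx, vy) at a pixel s from the gradients inside its window Omega.
    Fx_win, Fy_win, Ft_win and the weights w_win are 1D arrays over the window pixels."""
    # Weighted least squares: minimise sum over the window of w * (Fx*vx + Fy*vy + Ft)^2.
    A = np.stack([Fx_win, Fy_win], axis=1)    # one constraint line per window pixel
    b = -Ft_win
    W = np.diag(w_win)
    S = A.T @ W @ A                           # 2x2 matrix whose eigenvalues measure reliability
    lam = np.linalg.eigvalsh(S)               # sorted ascending
    if lam[0] < tau:
        return None                           # aperture problem: flow not uniquely determined here
    return np.linalg.solve(S, A.T @ W @ b)    # the full velocity v

Here the 2x2 matrix built from the windowed gradients plays the role of the matrix S above, and the eigenvalue test mirrors the λ discussion: only when the smaller eigenvalue is clearly non-zero does the window contain enough structure to resolve the aperture problem.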
The LK algorithm needs regularization (e.g. Tikhonov regularization) to remain stable outside regions of point-like motion, but this adds the difficulty of choosing the regularization parameter.
Since Lucas-Kanade's technique employs a local window to determine the optical flow of a specific image point, it is called a local method. Indeed, in Lucas-Kanade's technique the flow of a point is calculated by finding the intersection of all the flow constraint lines (fig. 6) corresponding to the image pixels inside the window w. Those lines will have an intersection because Lucas-Kanade's technique assumes that the flow within the window is constant.

Table of contents :

1 Introduction
1.1 Problem Statement (Motivation)
1.2 Approach Chosen to Solve the Problem
1.3 Limitations
1.4 Thesis Goals and Contribution
2 Background, Definitions and Related works
2.1 Optical flow and Motion
2.2 Optical flow calculation and Aperture problem
2.3 Optical flow methods
2.4 Gaussian Pyramid of Gradients
3 Estimation of optical flow vectors
3.1 Estimated Optical flow vectors for each pyramids’ level
3.2 MSE (Mean Squared Error)
3.3 Comparing of Error Vectors in Different Pyramid Levels
4 Optical flow algorithm’s results
4.1 Capturing and Collecting best optical flow vectors
5 Event detection
5.1 Making grayscale and binary image
5.2 Segmentation and Moment calculation
5.3 Results on Events Detection
5.4 Conclusions and suggestions for future works
Appendix
1 Developing of the “PL” algorithm
1.1 Starting with making video samples
1.2 How to set input parameters
2 Calculation of error vectors (displacement vector) for each pixel
2.1 Estimated Optical flow Vectors (Red) and Calculated flow motions (Blue) with desired speed
2.2 Calculation of MSE (Mean Squared Error).
References
