Optimal and approximated restorations in Gaussian linear Markov switching models


Restoration of simulated stationary data

We present two series of experiments here: Series 1 verifies the reversibility property of CGOMSM-R, and Series 2 shows the efficiency of both the exact forward and the exact backward restorations of CGOMSM when approximating a CGPMSM. All results presented here are averages over 100 independent experiments. The methods used in the following experiments are abbreviated as follows:
1. Opt-F: Optimal forward restoration knowing the true switches.
2. Opt-B: Optimal backward restoration knowing the true switches.
3. CGO-F: Exact forward CGOMSM restoration with unknown switches.
4. CGO-B: Exact backward CGOMSM restoration with unknown switches.

Each method includes both filtering and smoothing; a sketch of the forward recursion is given below.
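To make the exact forward recursion concrete, below is a minimal sketch for a scalar hidden state with K switches. All names (trans, a, q, f, g, h) and the scalar setting are illustrative assumptions, not the thesis's notation; the only structural property used is the defining CGOMSM one, namely that the pair (r, y) is Markov without x, so that the switch posterior and the per-switch conditional means admit exact closed-form recursions.

```python
import numpy as np

def cgomsm_forward_means(y, trans, a, q, f, g, h, m0, w0):
    """Exact forward filter for a scalar CGOMSM-style model (sketch).

    Hypothetical model notation:
      r_n in {0..K-1} Markov with transition matrix trans[j, k];
      y_{n+1} | r_n=j, r_{n+1}=k, y_n ~ N(a[j,k]*y_n, q[j,k]);
      x_{n+1} = f[j,k]*x_n + g[j,k]*y_n + h[j,k]*y_{n+1} + noise.
    Because (r, y) is Markov without x (the CGOMSM restriction),
    p(r_n | y_1..n) and E[x_n | r_n=j, y_1..n] follow exact recursions.
    """
    N, K = len(y), trans.shape[0]
    w = np.asarray(w0, dtype=float).copy()  # w[j] = p(r_n=j | y_1..n)
    m = m0 * np.ones(K)                     # m[j] = E[x_n | r_n=j, y_1..n]
    x_est = np.empty(N)
    x_est[0] = w @ m
    for n in range(N - 1):
        # Gaussian likelihoods of y_{n+1} for each switch pair (j, k)
        lik = np.exp(-0.5 * (y[n + 1] - a * y[n]) ** 2 / q) / np.sqrt(2 * np.pi * q)
        # unnormalized joint weight of (r_n=j, r_{n+1}=k) given y_1..n+1
        joint = w[:, None] * trans * lik
        col = joint.sum(axis=0)
        # back-weights p(r_n=j | r_{n+1}=k, y_1..n+1) mix the means exactly
        back = joint / np.maximum(col, 1e-300)
        m = (back * (f * m[:, None] + g * y[n] + h * y[n + 1])).sum(axis=0)
        w = col / col.sum()
        x_est[n + 1] = w @ m                # E[x_{n+1} | y_1..n+1]
    return x_est
```

Exact recursions for the second moments follow the same pattern and are omitted here for brevity.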

EM-based parameter estimation of stationary CGPMSM

So far, we have considered the restoration of a CGPMSM assuming that all the parameters are known. From this Section on, we cope with unsupervised restoration, without any knowledge of the parameters. The primary problem to solve in the unsupervised case is parameter estimation. We deal with it using the classic EM principle, since in the Gaussian linear case the derivatives are computable in the M-step, and EM is more stable after convergence than ICE, see Figure 1.2a. However, when applying the EM principle to the general CGPMSM, the exact computation of the expectation given by (2.9) (recalled in standard form below) is not possible, with the consequence that neither $\theta_1$ nor $\theta_2$ ($\theta_1$ and $\theta_2$ being the two equivalent parameter sets of CGPMSM defined in Section 2.1.3) can be computed in a reasonable time in the E-step of the EM iterations. The main contribution of this Section is to propose a general estimation method.
Based on applying the EM principle twice, this method allows one to estimate all the model parameters² from the observations $Y_1^N$ only, given a known number $K$ of possible switches. The estimated parameters can then be used for smoothing, which yields unsupervised restoration. Firstly, we notice that if $R_1^N$ can be estimated, the CGPMSM degenerates into a GPMM with switching parameters.
² $M_x^j$ is always assumed to be known, since it cannot be recovered.
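For reference, the EM update whose E-step is the expectation denoted (2.9) above can be written in its standard form as follows (the thesis's exact notation may differ):

\[
\theta^{(q+1)} \;=\; \arg\max_{\theta}\; \mathbb{E}\!\left[\log p\!\left(x_1^N, r_1^N, y_1^N; \theta\right) \,\middle|\, y_1^N; \theta^{(q)}\right].
\]

In the general CGPMSM this expectation couples the continuous states $x_1^N$ with the $K^N$ possible switch sequences $r_1^N$, which is why it cannot be evaluated in reasonable time.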

EM estimation for CGPMSM with known switches

Knowing $\theta_2$ (the means) and the switches, a CGPMSM is actually a GPMM with switching parameters. In this Section, we extend the constant-parameter GPMM-based EM algorithm [5] to the switching-parameter case that we are dealing with here. We call this extension “Switching EM”.
Assume that $r_1^N$ is known. For convenience in expressing the likelihood, let $\theta_z$ denote the parameter set of the likelihood, which is constituted from the parameter sets defined in Section 2.1.3. The function to update …
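To illustrate the kind of update involved, here is a sketch of an M-step for a linear-Gaussian pairwise transition $z_{n+1} = A_{jk} z_n + w_n$, $w_n \sim \mathcal{N}(0, Q_{jk})$, with known switches: the standard EM updates for linear-Gaussian models (as in [5]) are simply restricted to the time indices where $(r_n, r_{n+1}) = (j, k)$. The smoothed moments Ezz and Ezz1 are assumed to be supplied by an E-step (a Kalman-type smoother run with the known switch sequence); names and shapes are illustrative, not the thesis's interface.

```python
import numpy as np

def switching_m_step(r, Ezz, Ezz1, K):
    """M-step of a 'Switching EM' sketch with known switches r_0..r_{N-1}.

    Ezz[n]  ~ E[z_n z_n^T     | y]  (d x d smoothed second moment)
    Ezz1[n] ~ E[z_n z_{n+1}^T | y]  (d x d smoothed cross moment)
    Returns per-pair estimates A[j,k], Q[j,k] of the linear-Gaussian
    transition z_{n+1} = A_{jk} z_n + w_n, w_n ~ N(0, Q_{jk}).
    """
    d = Ezz[0].shape[0]
    A = np.zeros((K, K, d, d))
    Q = np.zeros((K, K, d, d))
    for j in range(K):
        for k in range(K):
            idx = [n for n in range(len(r) - 1) if r[n] == j and r[n + 1] == k]
            if not idx:
                continue  # pair (j, k) never observed; leave zeros
            S  = sum(Ezz[n]     for n in idx)  # sum of E[z_n z_n^T]
            C  = sum(Ezz1[n]    for n in idx)  # sum of E[z_n z_{n+1}^T]
            S1 = sum(Ezz[n + 1] for n in idx)  # sum of E[z_{n+1} z_{n+1}^T]
            A[j, k] = np.linalg.solve(S, C).T  # A = C^T S^{-1}
            Q[j, k] = (S1 - A[j, k] @ C - C.T @ A[j, k].T
                       + A[j, k] @ S @ A[j, k].T) / len(idx)
    return A, Q
```

Restricting the sums to each pair $(j, k)$ is precisely what turns the constant-parameter update into the switching-parameter one.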

Overall double-EM algorithm

So far, we have explained Step A and Step B; these two steps already suffice to estimate all the parameters. However, to improve the initialization of Step A, the entire Double EM is constructed by applying Step A and Step B sequentially, followed by a feedback Step C that updates the initialization of the parameters for Step A, so that these three steps can be iterated over several loops to obtain a better estimation.
In detail, the feedback Step C returns $\hat{\theta}_1$ and $\hat{\theta}_2$ given by Step A, together with the variance–covariance matrices of $p(y_1, y_2 \mid r_1 = j, r_2 = k)$ extracted from $\hat{\theta}_3$ given by Step B, as the initialization of the EM for the discrete state-space PMC in the next loop's Step A, replacing the K-means initialization, which may cause failure.
The entire Double EM parameter estimation algorithm is summarized in Algorithm …; a high-level sketch of its control flow is given below.
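As a reading aid only, here is a hypothetical sketch of that control flow in Python. The callables step_a, estimate_switches, step_b, and feedback are placeholders introduced for illustration, standing for Step A's EM for the discrete state-space PMC, the switch estimation it enables, Step B's Switching EM, and the feedback Step C; none of them is the thesis's actual interface.

```python
def double_em(y, K, step_a, estimate_switches, step_b, feedback, init, n_loops=3):
    """Double EM control flow (sketch).

    All step functions are caller-supplied placeholders for the thesis's
    Steps A-C; only the overall loop structure is illustrated here.
    `init` is the first-loop initialization (e.g. from K-means, which may fail);
    Step C replaces it on subsequent loops.
    """
    theta1 = theta2 = theta3 = None
    for _ in range(n_loops):
        theta1, theta2 = step_a(y, K, init)           # Step A: EM for discrete PMC
        r_hat = estimate_switches(y, theta1, theta2)  # switches estimated from Step A
        theta3 = step_b(y, r_hat)                     # Step B: Switching EM
        init = feedback(theta1, theta2, theta3)       # Step C: initialization feedback
    return theta1, theta2, theta3
```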


Table of contents:

Abstract
Résumé
Introduction
1 Pairwise Markov chain and basic methods 
1.1 Different dependences in PMC
1.2 PMC with discrete finite state-space
1.2.1 Optimal restoration
1.2.2 Unsupervised restoration
1.2.2.1 EM for Gaussian stationary case
1.2.2.2 ICE for stationary case
1.2.2.3 Principles for inferring hidden states
1.3 PMC with continuous state-space
1.3.1 Restoration of continuous state-space PMC
1.4 Conclusion
2 Optimal and approximated restorations in Gaussian linear Markov switching models 
2.1 Filtering and smoothing
2.1.1 Definition of CGPMSM and CGOMSM
2.1.2 Optimal restoration in CGOMSM
2.1.3 Parameterization of stationary models
2.1.3.1 Reversible CGOMSM
2.1.4 Restoration of simulated stationary data
2.2 EM-based parameter estimation of stationary CGPMSM
2.2.1 EM estimation for CGPMSM with known switches
2.2.2 Overall double-EM algorithm
2.2.3 Discussion about special failure case of double-EM algorithm
2.3 Unsupervised restoration in CGPMSM
2.3.1 Two restoration approaches in CGPMSM
2.3.1.1 Approximation based on parameter modification
2.3.1.2 Approximation based on EM
2.3.2 Double EM based unsupervised restorations
2.3.2.1 Experiment on varying switching observation means
2.3.2.2 Experiment on varying noise levels
2.4 Conclusion
3 Non-Gaussian Markov switching model with copulas 
3.1 Generalization of conditionally observed Markov switching model
3.1.1 Definition of GCOMSM
3.1.2 Model simulation
3.2 Optimal restoration in GCOMSM
3.2.1 Optimal filtering in GCOMSM
3.2.2 Optimal smoothing in GCOMSM
3.2.3 Examples of GCOMSM and the optimal restoration in them
3.2.3.1 Example 1 – Gaussian linear case
3.2.3.2 Example 2 – non-Gaussian non-linear case
3.3 Model identification
3.3.1 Generalized iterative conditional estimation
3.3.2 Least-square parameter estimation for non-linear switching model
3.3.3 The overall GICE-LS identification algorithm
3.4 Performance and application of the GICE-LS identification algorithm
3.4.1 Performance on simulated GCOMSM data
3.4.1.1 Gaussian linear case
3.4.1.2 Non-Gaussian non-linear case
3.4.2 Application of GICE-LS to non-Gaussian non-linear models
3.4.2.1 On stochastic volatility data
3.4.2.2 On Kitagawa data
3.5 Conclusion
4 Conclusion and perspectives
A Maximization of the likelihood function in Switching EM
B Particle filter for CGPMSM
B.1 Particle Filter
B.1.1 Sequential Importance Sampling
B.1.2 Importance distribution and weight
B.1.3 Sampling importance resampling (SIR)
B.2 Particle Smoother
C Margins and copulas used in this dissertation
D Publications
Bibliography

