M-estimators of the scatter matrix for stationary and non-stationary elliptical distributions 


Dual representations of the divergence under L-moment constraints

The minimization of ϕ-divergences under linear equality constraints is performed using Fenchel-Legendre duality. It transforms the constrained problem into an unconstrained one in the space of Lagrangian parameters. Let ψ denote the Fenchel-Legendre transform of ϕ, namely, for any t ∈ R,

ψ(t) := sup_{x ∈ R} {tx − ϕ(x)}.

Let us recall that dom(ϕ) = (a_ϕ, b_ϕ). We can now present a general duality result for the two optimization problems, which transforms a constrained problem (possibly in an infinite-dimensional space) into an unconstrained one in R^l. Let C : Ω → R^l and a ∈ R^l. Denote

L_{C,a} = { g : Ω → R s.t. ∫ g(t) C(t) µ(dt) = a }.
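As a worked example not spelled out at this point in the text, ψ can be computed in closed form for the Kullback-Leibler case ϕ(x) = x log x − x + 1 (for x > 0):

```latex
% Fenchel-Legendre transform of the Kullback-Leibler integrand
\psi(t) = \sup_{x>0}\,\bigl\{tx - x\log x + x - 1\bigr\}.
% First-order condition: \frac{d}{dx}\bigl(tx - x\log x + x - 1\bigr) = t - \log x = 0,
% hence x^* = e^t, and plugging back in:
\psi(t) = t e^t - e^t \log(e^t) + e^t - 1 = e^t - 1.
```

Here ψ(t) = e^t − 1 is finite and differentiable on all of R, which is the situation covered by the second part of Proposition 1.4 below.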
Proposition 1.4. Let µ be a σ-finite measure on Ω ⊂ R. Let C : Ω → R^l be an array of functions such that

∫_Ω ‖C(t)‖ µ(dt) < ∞.

If there exists some g in L_{C,a} such that a_ϕ < g < b_ϕ µ-a.s., then the duality gap is zero, i.e.

inf_{g ∈ L_{C,a}} ∫_Ω ϕ(g) dµ = sup_{ξ ∈ R^l} [ ⟨ξ, a⟩ − ∫_Ω ψ(⟨ξ, C(x)⟩) µ(dx) ].   (1.24)

Moreover, if ψ is differentiable, if µ is positive and if there exists a solution ξ* of the dual problem which is an interior point of

{ ξ ∈ R^l s.t. ∫_Ω ψ(⟨ξ, C(x)⟩) µ(dx) < ∞ },

then the infimum in (1.24) is attained at g* = ψ'(⟨ξ*, C(·)⟩).
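As a rough numerical sketch of this duality (assuming the Kullback-Leibler case ψ(t) = e^t − 1 and a discrete measure µ, so the dual becomes a smooth finite-dimensional problem; the constraint array C(x) = (1, x), the target a and all variable names are illustrative, not taken from the text), the primal solution is recovered from the dual optimum through the standard relation g* = ψ'(⟨ξ*, C(·)⟩):

```python
import numpy as np
from scipy.optimize import minimize

# KL case: phi(x) = x log x - x + 1, psi(t) = e^t - 1, psi'(t) = e^t.
rng = np.random.default_rng(0)
x = rng.normal(size=500)              # support points of a discrete mu
mu = np.full(x.size, 1.0 / x.size)    # uniform weights

# Illustrative constraints: int g dmu = 1 and int g(t) t dmu = 0.3.
C = np.stack([np.ones_like(x), x])    # C(x) = (1, x), shape (l, n)
a = np.array([1.0, 0.3])

def neg_dual(xi):
    # -( <xi, a> - int psi(<xi, C(x)>) dmu ), minimized over xi in R^l
    s = xi @ C
    return -(xi @ a - np.sum(mu * (np.exp(s) - 1.0)))

xi_star = minimize(neg_dual, np.zeros(2)).x
g_star = np.exp(xi_star @ C)          # g* = psi'(<xi*, C(x)>)

# g* satisfies the linear constraints up to optimization tolerance
print(np.sum(mu * g_star), np.sum(mu * g_star * x))
```

The unconstrained dual optimization over ξ ∈ R² replaces the original constrained minimization over the (here 500-dimensional) space of functions g.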

General definition of multivariate L-moments

Let X be a random vector in R^d. We wish to exploit the representation given by equation (2.2) in order to define multivariate L-moments. Recall that we chose quantiles as mappings between [0; 1]^d and R^d.
We now make explicit an orthogonal polynomial basis on [0; 1]^d. Let α = (i_1, …, i_d) ∈ N^d be a multi-index and let

L_α(t_1, …, t_d) = ∏_{k=1}^d L_{i_k}(t_k),

where the L_{i_k}'s are the univariate Legendre polynomials defined by equation (2.3), be the natural multivariate extension of the Legendre polynomials. Indeed, the following holds.

Lemma 2.5. The family (L_α) is orthogonal and complete in the Hilbert space L²([0; 1]^d, R) equipped with the usual scalar product:

∀f, g ∈ L²([0; 1]^d), ⟨f, g⟩ = ∫_{[0;1]^d} f(u) g(u) du.   (2.7)

Proof. The orthogonality is straightforward: if α = (i_1, …, i_d) ≠ α' = (i'_1, …, i'_d), there exists a subindex 1 ≤ k ≤ d such that i_k ≠ i'_k, and

∫_{[0;1]^d} L_α(t_1, …, t_d) L_{α'}(t_1, …, t_d) dt_1 … dt_d = ∏_{j=1}^d ∫_0^1 L_{i_j}(t_j) L_{i'_j}(t_j) dt_j = 0.   (2.8)
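A quick numerical check of the orthogonality part of Lemma 2.5 (a sketch for d = 2, using SciPy's shifted Legendre polynomials on [0; 1] and a tensor Gauss-Legendre quadrature rule; the multi-indices below are arbitrary illustrations):

```python
import numpy as np
from scipy.special import eval_sh_legendre

# Gauss-Legendre nodes/weights on [-1, 1], mapped to [0, 1]
nodes, weights = np.polynomial.legendre.leggauss(20)
t = (nodes + 1.0) / 2.0
w = weights / 2.0

def inner(alpha, beta):
    # <L_alpha, L_beta> on [0,1]^2, with L_alpha(t1, t2) = L_{a1}(t1) L_{a2}(t2)
    fa = np.outer(eval_sh_legendre(alpha[0], t), eval_sh_legendre(alpha[1], t))
    fb = np.outer(eval_sh_legendre(beta[0], t), eval_sh_legendre(beta[1], t))
    return np.einsum('i,j,ij->', w, w, fa * fb)

print(inner((1, 2), (1, 2)))   # squared norm of L_(1,2), nonzero
print(inner((1, 2), (2, 2)))   # indices differ in one coordinate: ~0
```

The squared norm factorizes as a product of univariate norms, ∫_0^1 L_n(t)² dt = 1/(2n+1) for the shifted Legendre polynomials, so the first value is 1/3 · 1/5 = 1/15.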

Compatibility with univariate L-moments

The definition we adopted for general L-moments is compatible with the usual one in dimension 1, since the univariate quantile is a transport.
Definition 2.8. Let ν be a real probability measure. The quantile is the generalized inverse of the distribution function:

Q(t) = inf{x ∈ R s.t. ν((−∞; x]) ≥ t}.   (2.16)
Proposition 2.9. If we denote by µ the uniform measure on [0; 1], then Q#µ = ν, i.e. Q(U) has the same distribution as X, where U is uniformly distributed on [0; 1] and X denotes the random variable associated to ν.

Proof. Let x ∈ R. We denote by F the cdf of X and, for t ∈ [0; 1], by A_t the set A_t = {x ∈ R s.t. F(x) ≥ t}, so that Q(t) = inf A_t. We wish to prove:

{t ∈ [0; 1] s.t. Q(t) ≤ x} = {t ∈ [0; 1] s.t. t ≤ F(x)}.   (2.17)

We temporarily admit this assertion. Then P[Q(U) ≤ x] = P[U ≤ F(x)] = F(x).
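Proposition 2.9 is the basis of inverse-transform sampling, which can be checked empirically. A minimal sketch, using an illustrative discrete measure ν on {0, 1, 2} (not from the text) whose cdf has jumps, precisely the case where the generalized inverse matters:

```python
import numpy as np

# A discrete nu on {0, 1, 2} with weights (0.2, 0.5, 0.3)
atoms = np.array([0.0, 1.0, 2.0])
probs = np.array([0.2, 0.5, 0.3])
F = np.cumsum(probs)                  # cdf values at the atoms: 0.2, 0.7, 1.0

def Q(t):
    # generalized inverse (2.16): smallest atom x with F(x) >= t
    return atoms[np.searchsorted(F, t, side='left')]

rng = np.random.default_rng(1)
u = rng.uniform(size=200_000)         # U ~ uniform on [0, 1]
samples = Q(u)                        # Q(U) should be distributed as nu

# empirical frequencies should reproduce the weights of nu
freq = np.array([(samples == a).mean() for a in atoms])
print(freq)
```

The `side='left'` convention in `searchsorted` matches the infimum in Definition 2.8: at a jump point t = F(x), the smallest admissible atom is returned.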

Table of contents:

0.1 General considerations
0.2 Application context: detection of slow targets in inhomogeneous clutter
0.3 Chapter 1: estimation for models defined by L-moment conditions
0.4 Chapter 2: multivariate L-moments
0.5 Chapter 3: M-estimators for elliptical models
1 Estimation under L-moment condition models 
1.1 Motivation and notation
1.2 L-moments
1.2.1 Definition and characterizations
1.2.2 Estimation of L-moments
1.3 Models defined by moment and L-moment equations
1.3.1 Models defined by moment conditions
1.3.2 Models defined by L-moments conditions
1.3.3 Extension to models defined by order statistics conditions
1.4 Minimum of ϕ-divergence estimators
1.4.1 ϕ-divergences
1.4.2 M-estimates with L-moments constraints
1.5 Dual representations of the divergence under L-moment constraints
1.6 Reformulation of divergence projections and extensions
1.6.1 Minimum of an energy of deformation
1.6.2 Transportation functionals and multivariate generalization
1.6.3 Relation to elasticity theory
1.7 Asymptotic properties of the L-moment estimators
1.8 Numerical applications: inference for the Generalized Pareto family
1.8.1 Presentation
1.8.2 Moments and L-moments calculus
1.8.3 Simulations
1.9 Appendix
1.9.1 Proof of Lemma 1.1
1.9.2 Proof of Lemma 1.2
1.9.3 Proof of Proposition 1.4
1.9.4 Proof of Theorem 1.2
1.9.5 Proof of Theorem 1.3
2 Multivariate quantiles and multivariate L-moments 
2.1 Motivations and notations
2.2 Definition of multivariate L-moments and examples
2.2.1 General definition of multivariate L-moments
2.2.2 L-moments ratios
2.2.3 Compatibility with univariate L-moments
2.2.4 Relation with depth-based quantiles
2.3 Optimal transport
2.3.1 Formulation of the problem and main results
2.3.2 Optimal transport in dimension 1
2.3.3 Examples of monotone transports
2.4 L-moments issued from the monotone transport
2.4.1 Monotone transport from the uniform distribution on [0; 1]^d
2.4.2 Monotone transport for copulas
2.4.3 Monotone transport from the standard Gaussian distribution
2.5 Rosenblatt transport and L-moments
2.5.1 General multivariate case
2.5.2 The case of bivariate L-moments of the form 1r and r1
2.6 Estimation of L-moments
2.6.1 Estimation of the Rosenblatt transport
2.6.2 Estimation of a monotone transport
2.7 Some extensions
2.7.1 Trimming
2.7.2 Hermite L-moments
2.8 Appendix
2.8.1 Proof of Theorem 2.6.2
3 M-estimators of the scatter matrix for stationary and non-stationary elliptical distributions 
3.1 Introduction
3.1.1 Motivation and notations
3.1.2 Models
3.1.3 Considered contamination
3.2 Stationary elliptical models: scale mixture of autoregressive vectors
3.2.1 Burg method applied to Gaussian autoregressive vectors
3.2.2 Burg method for non-Gaussian vectors
3.3 Optimization on Riemannian manifolds
3.3.1 Riemannian metrics, geodesics and exponential map
3.3.2 Minimization of a function on a Riemannian manifold: Riemannian steepest descent
3.3.3 Steepest descent for the manifold of Hermitian positive definite matrices
3.3.4 Steepest descent in the Poincaré disk
3.4 Summary of the Burg algorithms
3.5 Regularization for Burg estimators
3.5.1 Regularized Gaussian Burg estimator
3.5.2 Regularized Normalized Burg estimator
3.5.3 Regularized Elliptical Burg estimator
3.5.4 Illustration
3.6 Elliptical models and robustness to heavy contamination
3.6.1 Two classes of robust M-estimators
3.6.2 Application to Angular Complex Gaussian distribution
3.7 Autoregressive modeling and robustness with respect to heavy contamination
3.7.1 Burg M-estimator
3.7.2 Geodesic Burg estimators
3.8 Radar detection for non-Gaussian noise
3.8.1 Test of hypotheses
3.8.2 GLRT detector
3.8.3 Capon detector
3.9 Applications for radar detection
3.9.1 Quality of the estimation
3.9.2 Scenario 0 : no outlier
3.9.3 Scenario 1 : multiple targets
3.9.4 Scenario 2 : clutter transition
3.9.5 Computation time
3.9.6 Simulations analysis
3.10 Appendix
3.10.1 Proof of the asymptotic bias of Log-Burg estimator


