The conditional density assumption and credit contagion


General theory of stochastic processes

In this subsection, we mainly recall the fundamental notions of optional and predictable projections and of dual optional and predictable projections. First, recall that the optional $\sigma$-field $\mathcal{O}$ is the $\sigma$-algebra on $\Omega \times \mathbb{R}_+$ generated by all càdlàg (right-continuous with left limits) processes that are $\mathbb{F}$ adapted, and that the predictable $\sigma$-field $\mathcal{P}$ is the $\sigma$-algebra on $\Omega \times \mathbb{R}_+$ generated by all càg (left-continuous) processes that are $\mathbb{F}$ adapted. A random time $\tau$ is said to be predictable if the stochastic interval $[[0,\tau[[$ is predictable.
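To fix ideas, two standard examples (not taken from the thesis itself): a deterministic time $t_0$ is predictable, since $\mathbf{1}_{[[0,t_0[[}(s) = \mathbf{1}_{\{s < t_0\}}$ is a deterministic, hence adapted, left-continuous process, so that $[[0,t_0[[ \in \mathcal{P}$; by contrast, the first jump time $T_1$ of a standard Poisson process (in its own filtration) is the typical example of a time which is not predictable; it is in fact totally inaccessible, a notion used in Theorem 2 below.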

Optional and predictable projections

Theorem 1 Let $X$ be a measurable process, either positive or bounded.
(i) There exists a unique (up to indistinguishability) optional process, called the optional projection of $X$ and denoted ${}^o X$, such that $\mathbb{E}\big(X_T \mathbf{1}_{\{T<\infty\}} \mid \mathcal{F}_T\big) = {}^o X_T \, \mathbf{1}_{\{T<\infty\}}$ a.s. for every stopping time $T$.
(ii) There exists a unique (up to indistinguishability) predictable process, called the predictable projection of $X$ and denoted ${}^p X$, such that $\mathbb{E}\big(X_T \mathbf{1}_{\{T<\infty\}} \mid \mathcal{F}_{T-}\big) = {}^p X_T \, \mathbf{1}_{\{T<\infty\}}$ a.s. for every predictable stopping time $T$.
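As an elementary illustration of Theorem 1 (a standard example, stated for a bounded random variable $\xi$): take $X_t(\omega) = \xi(\omega)$ for all $t$, and let $M$ be a càdlàg version of the martingale $M_t = \mathbb{E}(\xi \mid \mathcal{F}_t)$. Then
\[ {}^o X_t = M_t \quad \text{and} \quad {}^p X_t = M_{t-}, \]
since $\mathbb{E}(\xi \mathbf{1}_{\{T<\infty\}} \mid \mathcal{F}_T) = M_T \mathbf{1}_{\{T<\infty\}}$ by optional sampling for every stopping time $T$, and $\mathbb{E}(\xi \mathbf{1}_{\{T<\infty\}} \mid \mathcal{F}_{T-}) = M_{T-} \mathbf{1}_{\{T<\infty\}}$ for every predictable stopping time $T$.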
The following characterization of predictable processes turns out to be very useful.
Theorem 2 Let $X$ be a càdlàg adapted process. Then $X$ is predictable if and only if $\Delta X_T = 0$ a.s. on $\{T<\infty\}$ for every totally inaccessible stopping time $T$, and $X_T \mathbf{1}_{\{T<\infty\}}$ is $\mathcal{F}_{T-}$ measurable for every predictable stopping time $T$.
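Theorem 2 makes it easy to check, for instance (a standard example), that a Poisson process $N$ with intensity $\lambda$ is not predictable: at its jump times $T_n$, which are totally inaccessible, $\Delta N_{T_n} = 1 \neq 0$. On the other hand, the left-continuous adapted process $N_{t-}$ is predictable by definition of $\mathcal{P}$.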

Dual optional and predictable projections

While the projection operation makes rigorous what we would write formally as $\mathbb{E}(X_t \mid \mathcal{F}_t)$, the dual projection makes rigorous the quantity $\int_0^t \mathbb{E}(dA_s \mid \mathcal{F}_s)$ for a right-continuous increasing integrable process $A$, which is not necessarily adapted.
Theorem 3 Let $A$ be a right-continuous increasing integrable process. Then:
(i) There exists a unique (up to indistinguishability) optional increasing process $A^o$, called the dual optional projection of $A$, such that $\mathbb{E}\big(\int_0^\infty X_s \, dA^o_s\big) = \mathbb{E}\big(\int_0^\infty {}^o X_s \, dA_s\big)$ for every bounded measurable process $X$.
(ii) There exists a unique (up to indistinguishability) predictable increasing process $A^p$, called the dual predictable projection of $A$, such that $\mathbb{E}\big(\int_0^\infty X_s \, dA^p_s\big) = \mathbb{E}\big(\int_0^\infty {}^p X_s \, dA_s\big)$ for every bounded measurable process $X$.
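As a simple example of a dual predictable projection (a standard computation, assuming $\tau$ is exponentially distributed with parameter $\lambda$ and independent of $\mathcal{F}_\infty$): let $A_t = \mathbf{1}_{\{\tau \le t\}}$, which is increasing and integrable but not $\mathbb{F}$ adapted. For any bounded measurable $X$, using the independence and the fact that $\mathbb{E}({}^p X_s) = \mathbb{E}(X_s)$ at deterministic times,
\[ \mathbb{E}\Big(\int_0^\infty {}^p X_s \, dA_s\Big) = \mathbb{E}\big({}^p X_\tau\big) = \int_0^\infty \mathbb{E}({}^p X_s)\, \lambda e^{-\lambda s}\, ds = \int_0^\infty \mathbb{E}(X_s)\, \lambda e^{-\lambda s}\, ds, \]
so that $A^p_t = 1 - e^{-\lambda t}$, the (deterministic) distribution function of $\tau$.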
The jumps of the dual predictable projection are given by the following lemma.
Lemma 1 Let $T$ be a predictable stopping time. Then $\Delta A^p_T = \mathbb{E}(\Delta A_T \mid \mathcal{F}_{T-})$ a.s., with the convention $\Delta A_\infty = 0$.
If, in addition, $A$ is optional, then $A^p$ is called the compensator of $A$.
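Lemma 1 can be checked by hand in the following simple situation (an illustrative example, with $\xi$ bounded, nonnegative and not necessarily $\mathcal{F}_1$ measurable): let $A_t = \xi \mathbf{1}_{\{t \ge 1\}}$. Since the deterministic time $1$ is predictable, one checks that $A^p_t = \mathbb{E}(\xi \mid \mathcal{F}_{1-})\, \mathbf{1}_{\{t \ge 1\}}$: for bounded measurable $X$,
\[ \mathbb{E}\Big(\int_0^\infty {}^p X_s \, dA_s\Big) = \mathbb{E}\big({}^p X_1 \, \xi\big) = \mathbb{E}\big({}^p X_1 \, \mathbb{E}(\xi \mid \mathcal{F}_{1-})\big) = \mathbb{E}\Big(\int_0^\infty X_s \, dA^p_s\Big), \]
and indeed $\Delta A^p_1 = \mathbb{E}(\Delta A_1 \mid \mathcal{F}_{1-}) = \mathbb{E}(\xi \mid \mathcal{F}_{1-})$.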
Lemma 2 Let $A$ be an optional process of integrable variation. Then $A^p$ is the unique increasing predictable process such that $A - A^p$ is a martingale.
Finally, we will also need the following result in Section 1.3.3.
Theorem 4 (i) Every predictable process of finite variation is locally integrable. (ii) An optional process $A$ of finite variation is locally integrable if and only if there exists a predictable process $\tilde{A}$ of finite variation such that $A - \tilde{A}$ is a local martingale which vanishes at $0$. When it exists, $\tilde{A}$ is unique. We say that $\tilde{A}$ is the predictable compensator of $A$.
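The classical example of a compensator (a standard fact, for an $\mathbb{F}$ Poisson process $N$ with intensity $\lambda$ and first jump time $T_1$): the adapted increasing process $A_t = \mathbf{1}_{\{t \ge T_1\}} = N_{t \wedge T_1}$ has compensator
\[ A^p_t = \lambda (t \wedge T_1), \]
since $N_{t \wedge T_1} - \lambda (t \wedge T_1)$ is the compensated Poisson martingale stopped at $T_1$, in line with Lemma 2 and Theorem 4 (ii).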

Azéma supermartingales and dual projections associated with random times

The following processes will be of crucial importance in filtration expansions with random times. Let $\tau$ be a random time and let $Z$ be the optional projection onto $\mathbb{F}$ of $\mathbf{1}_{[[0,\tau[[}$. Then $Z$ is an $\mathbb{F}$ supermartingale, given by a càdlàg version of $P(\tau > t \mid \mathcal{F}_t)$, and is called the Azéma supermartingale. Let $a$ and $A$ be respectively the dual optional and dual predictable projections of the process $\mathbf{1}_{\{\tau \le t\}}$. Then $m_t = \mathbb{E}(a_\infty \mid \mathcal{F}_t) = a_t + Z_t$ is a BMO martingale. Finally, let $Z_t = M_t - A_t$ be the Doob-Meyer decomposition of the supermartingale $Z$. The following holds.
Lemma 3 (i) If $\tau$ avoids all $\mathbb{F}$ stopping times, then $A_t = a_t$ is continuous. (ii) If all $\mathbb{F}$ martingales are continuous, then $a$ is predictable and consequently $A = a$. (iii) Under the two conditions above, $Z$ is continuous.
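All of these objects are explicit in the Cox construction (a standard example, assuming $\Lambda$ is a continuous increasing $\mathbb{F}$ adapted process with $\Lambda_0 = 0$ and $\Lambda_\infty = \infty$, and $\Theta$ is a standard exponential random variable independent of $\mathcal{F}_\infty$): set $\tau = \inf\{t : \Lambda_t \ge \Theta\}$. Then
\[ Z_t = P(\tau > t \mid \mathcal{F}_t) = P(\Theta > \Lambda_t \mid \mathcal{F}_t) = e^{-\Lambda_t}, \]
$\tau$ avoids all $\mathbb{F}$ stopping times (because $\Theta$ has a diffuse law and is independent of $\mathcal{F}_\infty$), and
\[ a_t = A_t = \int_0^t e^{-\Lambda_s}\, d\Lambda_s = 1 - e^{-\Lambda_t}, \qquad M_t \equiv 1, \qquad m_t = \mathbb{E}(a_\infty \mid \mathcal{F}_t) = 1 = a_t + Z_t, \]
so that $Z$ is continuous, in line with Lemma 3.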
The aim of the next two sections is to present the theory of filtration expansions. The question of interest is the one we already emphasized in the introduction: do $\mathbb{F}$ semimartingales remain semimartingales in a filtration $\mathbb{G}$ containing $\mathbb{F}$?
Definition 1 We say that hypothesis $(\mathcal{H}')$ holds between $\mathbb{F}$ and $\mathbb{G}$ if every $\mathbb{F}$ semimartingale is a $\mathbb{G}$ semimartingale.
When hypothesis $(\mathcal{H}')$ does not hold between $\mathbb{F}$ and $\mathbb{G}$, we focus on finding conditions under which a given $\mathbb{F}$ semimartingale is a $\mathbb{G}$ semimartingale. Of course, one can restrict oneself to $\mathbb{F}$ martingales. The reverse situation is known as filtration shrinkage. We do not cite the known results on filtration shrinkage here, but recall them throughout the thesis when needed. We refer to [50] for a nice paper on filtration shrinkage issues. However, we do recall Stricker's theorem, which we will use extensively in the next sections.
Theorem 5 (Stricker) Let $\mathbb{F} \subset \mathbb{G}$ be two filtrations. If $X$ is a $\mathbb{G}$ semimartingale which is $\mathbb{F}$ adapted, then $X$ is also an $\mathbb{F}$ semimartingale.
As already mentioned in the introduction, there are essentially two types of filtration expansions: initial and progressive filtration expansion.


Initial filtration expansion with a random variable

Let $L$ be a random variable. Define $\mathbb{H} = (\mathcal{H}_t)_{t \ge 0}$ where $\mathcal{H}_t = \bigcap_{u>t} \big(\mathcal{F}_u \vee \sigma(L)\big)$.
The main theoretical result has been derived by Jacod; we state it in the case where $L$ takes values in $\mathbb{R}^d$. The conditional probabilities of $L$ given $\mathcal{F}_t$, for $t \ge 0$, play a crucial role in this type of filtration expansion.
Assumption 1 (Jacod's criterion) There exists a $\sigma$-finite measure $\eta$ on $\mathcal{B}(\mathbb{R}^d)$ such that $P(L \in \cdot \mid \mathcal{F}_t)(\omega) \ll \eta(\cdot)$ a.s. for every $t \ge 0$.
Without loss of generality, $\eta$ may be chosen as the law of $L$. Under Assumption 1, the $\mathcal{F}_t$ conditional density
\[ p_t(u,\omega) = \frac{P(L \in du \mid \mathcal{F}_t)(\omega)}{\eta(du)} \]
exists, and it can be chosen so that $(u,\omega,t) \mapsto p_t(u,\omega)$ is càdlàg in $t$ and measurable with respect to the optional $\sigma$-field associated with the filtration $\widehat{\mathbb{F}}$ given by $\widehat{\mathcal{F}}_t = \bigcap_{u>t} \mathcal{B}(\mathbb{R}^d) \otimes \mathcal{F}_u$. See Lemma 1.8 in [72].
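The classical example (a standard illustration, with $\mathbb{F}$ the natural filtration of a Brownian motion $B$ and $L = B_1$, considered on the time interval $[0,1)$): the conditional law $P(B_1 \in \cdot \mid \mathcal{F}_t)$ is Gaussian with mean $B_t$ and variance $1-t$, hence absolutely continuous with respect to $\eta = \mathcal{N}(0,1)$, the law of $B_1$, with density
\[ p_t(u) = \frac{1}{\sqrt{1-t}} \exp\Big( -\frac{(u-B_t)^2}{2(1-t)} + \frac{u^2}{2} \Big), \qquad 0 \le t < 1. \]
At $t = 1$ the conditional law degenerates to the Dirac mass at $B_1$, so Jacod's criterion only holds on $[0,1)$.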
We provide the explicit decompositions using the following classical result of Jacod; see [72], Theorem 2.5.
Theorem 6 Let $M$ be an $\mathbb{F}$ local martingale and assume Assumption 1 is satisfied. Then there exists a set $B \in \mathcal{B}(\mathbb{R}^d)$ with $\eta(B) = 0$ such that:
(i) $\langle p(u), M \rangle$ exists on $\{(t,\omega) : p_{t-}(u,\omega) > 0\}$ for every $u \notin B$;
(ii) there exist an increasing predictable process $A$ and an $\widehat{\mathbb{F}}$ predictable function $k_t(u,\omega)$ such that, for every $u \notin B$, $\langle p(u), M \rangle_t = \int_0^t k_s(u)\, p_{s-}(u)\, dA_s$ on $\{(t,\omega) : p_{t-}(u,\omega) > 0\}$;
(iii) $\int_0^t |k_s(L)|\, dA_s < \infty$ a.s. for every $t \ge 0$, and $M_t - \int_0^t k_s(L)\, dA_s$ is an $\mathbb{H}$ local martingale.
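Continuing the example above (a standard computation, on $[0,1)$): Itô's formula gives $dp_t(u) = p_t(u)\, \frac{u - B_t}{1-t}\, dB_t$, so that $\langle p(u), B \rangle_t = \int_0^t p_s(u)\, \frac{u - B_s}{1-s}\, ds$. In the notation of Theorem 6 one may take $A_t = t$ and $k_s(u) = \frac{u - B_s}{1-s}$, and the theorem yields that
\[ B_t - \int_0^t \frac{B_1 - B_s}{1-s}\, ds, \qquad 0 \le t < 1, \]
is an $\mathbb{H}$ local martingale (in fact an $\mathbb{H}$ Brownian motion); this is the classical Brownian bridge type decomposition of $B$ in the initially expanded filtration.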

Table of contents:

1 Filtration expansions and semimartingales 
1.1 Introduction
1.2 Mathematical preliminaries
1.2.1 General theory of stochastic processes
1.2.2 Initial filtration expansion with a random variable
1.2.3 Progressive filtration expansions
1.2.4 Weak convergence of filtrations and of σ-fields
1.3 Progressive filtration expansion under initial-type assumptions
1.3.1 Linking progressive and initial filtration expansion with one random time
1.3.2 The case of multiple non ranked random times
1.3.3 Connection to filtration shrinkage
1.4 Filtration expansion with processes
1.4.1 A semimartingale convergence result and applications to filtration expansion
1.4.2 Results based on Jacod’s type criterion for the increments of the process
1.4.3 Examples: Time reversed diffusions and Kohatsu-Higa’s example
1.5 Toward dynamic models for insider trading
1.5.1 Filtration expansion results based on a Jacod’s criterion for hitting times
1.5.2 Filtration expansion results based on an honest times assumption
1.5.3 Insider models with arbitrage: Jeulin's example and extensions
2 Compensators of random times and credit contagion 
2.1 Introduction
2.2 Information induced credit contagion in structural models
2.2.1 The base model for a single time
2.2.2 Multiple firms: A conditional independence model
2.2.3 A first structural model with credit contagion effect
2.2.4 A structural credit contagion model in finite time horizon
2.2.5 Structural models with random default barriers
2.3 Credit contagion under the conditional density assumption
2.3.1 The conditional density assumption: the case of two non ranked random times
2.3.2 Extension to multiple non ranked random times
2.3.3 Modeling of conditional densities
2.3.4 An application to risk management
2.3.5 The conditional density assumption and credit contagion
2.3.6 A toy structural model and credit contagion
3 Bubbles: martingale theory and real time detection 
3.1 Introduction
3.2 The martingale theory of asset bubbles
3.3 Mathematical preliminaries
3.3.1 Strict local martingales
3.3.2 Estimation of the volatility function in diffusion models
3.4 How to detect an asset bubble in real time
3.4.1 The methodology
3.4.2 Method 1: Parametric Estimation
3.4.3 Method 2: RKHS theory
3.4.4 The dotcom bubble
3.5 Is there a bubble in LinkedIn’s stock price? A real case study
3.5.1 Real time detection: LinkedIn's case
3.5.2 "Is there a bubble in LinkedIn's stock price?" in the news
4 Discretely and continuously sampled variance swaps 
4.1 Introduction
4.2 Framework and mathematical preliminaries
4.2.1 Variance swaps
4.2.2 Mathematical preliminaries
4.3 Approximation using the quadratic variation
4.3.1 Finiteness of expectations
4.3.2 Bounds on the approximation error
4.4 Examples
4.4.1 Strict local martingales
4.4.2 Stochastic volatility of volatility
4.4.3 Time changed geometric Brownian motion
4.4.4 The 3/2-stochastic volatility model
Bibliography
