Construction of fixed-sized point processes from exchangeable variables


Stability properties of finite DPPs

In this section, the ground space 𝒳 is assumed to be finite. We focus on proving some of the stability properties of DPPs under different set operations f(X) or conditionings X | ·. We first derive the inclusion-exclusion principle to express quantities of the form P[A ⊆ X, X ∩ B = ∅] for general point processes X. Then, we use Theorem 1.B.2 to specialize the inclusion-exclusion principle to finite DPPs, which reveals the determinantal structure of the previous quantities. As a direct application, we can explicitly derive the likelihood P[X = A], dear to users of the L-ensemble viewpoint, as well as the inclusion probabilities of the complementary process 𝒳 \ X, which proves to be a DPP with correlation kernel I − K.
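To make these identities concrete, here is a small numerical check, not taken from the thesis: it builds a Hermitian kernel K with spectrum in (0, 1) on a ground set of size M = 4, enumerates all subsets, and verifies that P[X = A] = det(L_A) / det(I + L) with L = K(I − K)^{-1}, that P[A ⊆ X] = det(K_A), and that the complementary process is indeed a DPP with kernel I − K. The kernel and the brute-force enumeration are purely illustrative.

    import itertools
    import numpy as np

    rng = np.random.default_rng(0)
    M = 4

    # A Hermitian correlation kernel K with eigenvalues in (0, 1).
    V, _ = np.linalg.qr(rng.standard_normal((M, M)))
    K = (V * rng.uniform(0.05, 0.95, size=M)) @ V.T

    # Likelihood kernel L = K (I - K)^{-1}, so that P[X = A] = det(L_A) / det(I + L).
    L = K @ np.linalg.inv(np.eye(M) - K)

    def minor(mat, idx):
        idx = list(idx)
        return np.linalg.det(mat[np.ix_(idx, idx)]) if idx else 1.0

    # Exact likelihood of every subset A of {0, ..., M-1}.
    subsets = [S for r in range(M + 1) for S in itertools.combinations(range(M), r)]
    proba = {S: minor(L, S) / np.linalg.det(np.eye(M) + L) for S in subsets}
    assert np.isclose(sum(proba.values()), 1.0)

    for A in [(0,), (1, 3), (0, 2, 3)]:
        p_in = sum(p for S, p in proba.items() if set(A) <= set(S))       # A contained in X
        p_out = sum(p for S, p in proba.items() if not set(A) & set(S))   # A contained in the complement
        assert np.isclose(p_in, minor(K, A))                 # P[A ⊆ X] = det K_A
        assert np.isclose(p_out, minor(np.eye(M) - K, A))    # complement is DPP(I - K)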

Sampling from projection DPPs

The motivation for this section is twofold. First, we present generic sampling methods that give a unifying view on the potentially different models they come from. Second, recalling that Hermitian DPPs can be seen as mixtures of orthogonal projection DPPs, cf. Theorem 1.2.4, sampling methods for orthogonal projection DPPs serve the more general purpose of sampling Hermitian DPPs, cf. Section 2.2.
In this section we lay the groundwork for projection DPP sampling and emphasize the geometrical interpretation of the generic sampling scheme for orthogonal projection DPPs.
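To fix ideas on the mixture argument mentioned above, here is a minimal sketch, assuming NumPy and a Hermitian kernel K with spectrum in [0, 1]; the function name is our own choice, and the projection kernel it returns is then sampled with the chain rule scheme of Section 2.1.

    import numpy as np

    def hermitian_dpp_as_projection(K, rng):
        """Reduce sampling DPP(K), K Hermitian with eigenvalues in [0, 1],
        to sampling an orthogonal projection DPP (cf. Theorem 1.2.4).

        Each eigenvector is kept independently with probability equal to its
        eigenvalue; the output is the (random) projection kernel onto the
        span of the kept eigenvectors."""
        eigvals, eigvecs = np.linalg.eigh(K)
        keep = rng.uniform(size=eigvals.shape) < eigvals    # Bernoulli(lambda_i) draws
        V_B = eigvecs[:, keep]
        return V_B @ V_B.T

    # A sample of DPP(K) is then a sample of the orthogonal projection DPP
    # with kernel hermitian_dpp_as_projection(K, rng).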

The continuous case

Apart from some specific instances, like the projection DPPs arising in random matrix theory as the eigenvalues of random matrices, the usual sampling scheme for continuous projection DPPs relies on the chain rule. Each of the N points forming the resulting sample {x_1, ..., x_N} is drawn sequentially, conditionally on the previously selected points, so that (x_1, ..., x_N) has distribution (1.1.15). The fact that the order in which the points were selected does not matter comes from the joint distribution of the points being invariant under permutation of its coordinates, i.e., the variables x_1, ..., x_N are exchangeable; see also Proposition 1.1.8.
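For reference, a standard form of this chain rule for an orthogonal projection kernel K of rank N, which should coincide with (1.1.15) up to notation (real notation for simplicity; use conjugate transposes for complex kernels), reads

    p(x_1) = \frac{K(x_1, x_1)}{N},
    \qquad
    p(x_{n+1} \mid x_1, \dots, x_n)
      = \frac{K(x_{n+1}, x_{n+1}) - \mathbf{k}_n(x_{n+1})^{\mathsf{T}} \mathbf{K}_n^{-1} \mathbf{k}_n(x_{n+1})}{N - n}
      = \frac{\det \mathbf{K}_{n+1}}{(N - n)\, \det \mathbf{K}_n},

where K_n = [K(x_p, x_q)]_{p,q=1..n} and k_n(x) = (K(x_1, x), ..., K(x_n, x))^T. Multiplying these conditionals telescopes into det[K(x_p, x_q)]_{p,q=1..N} / N!, which is symmetric in x_1, ..., x_N; this is another way to see the exchangeability mentioned above.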
Projection DPP sampling can thus be done using the chain rule, given an oracle to evaluate the kernel K(x, y) and to sample from the successive conditional distributions.
Proposition 2.1.1 (Projection DPP sampling given the kernel). To generate a sample {x_1, ..., x_N} from a projection DPP(μ, K), as defined by Proposition 1.1.8, it is sufficient to sequentially sample x_1, ..., x_N using the following chain rule scheme and forget the order in which the points were selected.
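In the continuous case, simulating each conditional typically requires an extra step, e.g. rejection sampling. As a purely illustrative companion, here is a minimal NumPy sketch of the same chain rule on a finite ground set {0, ..., M-1}; the function name and the way the example kernel is built are our own choices, not the thesis's.

    import numpy as np

    def sample_projection_dpp(K, rng):
        """Chain-rule sampler for a finite orthogonal projection DPP(K).

        K is an M x M orthogonal projection matrix of rank N, so the sample has
        exactly N points. At each step, the next item is drawn with probability
        proportional to the Schur complement K(x, x) - k_Y(x)^T K_Y^{-1} k_Y(x),
        where Y denotes the items already selected."""
        M = K.shape[0]
        N = int(round(np.trace(K)))            # rank of the projection = sample size
        sample = []
        for _ in range(N):
            if sample:
                K_Y_inv = np.linalg.inv(K[np.ix_(sample, sample)])
                k_Y = K[sample, :]             # rows K(Y, x) for all items x
                cond = np.diag(K) - np.einsum('ij,ik,kj->j', k_Y, K_Y_inv, k_Y)
            else:
                cond = np.diag(K).copy()
            cond = np.clip(cond, 0.0, None)    # guard against round-off
            cond[sample] = 0.0                 # already selected items
            sample.append(int(rng.choice(M, p=cond / cond.sum())))
        return set(sample)

    # Example: 10 items, a rank-3 projection kernel K = U U^T.
    rng = np.random.default_rng(0)
    U, _ = np.linalg.qr(rng.standard_normal((10, 3)))
    print(sample_projection_dpp(U @ U.T, rng))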

Finite DPPs defined by their correlation kernel

We survey different exact sampling methods for finite DPPs defined by their correlation kernel K, which is not assumed to be of projection type. The first ones, by Poulson (2019) and Launay, Galerne, and Desolneux (2018), are based on matrix factorization techniques and apply generically, even to non-Hermitian correlation kernels.
First, let us try to be generic and consider a correlation kernel K that is not assumed to be Hermitian or of projection type.
From a probabilistic viewpoint, a first way of thinking of the sampling of DPP(K) is to perform a sequential bottom-up chain rule on sets: starting from the empty set, each item 1, ..., M is decided in turn to be added or excluded to form the final sample X (Launay, Galerne, and Desolneux, 2018, Section 2.2). This can be formalized as a top-down exploration of the binary probability tree displayed in Figure 2.2. A useful identity in this context: for any two square matrices A and B, Tr[AB] = 1^T (A ∘ B^T) 1, where A ∘ B denotes the entrywise product and 1 the all-ones vector of the appropriate dimension.
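As an illustration of this sequential chain rule on sets, here is a minimal NumPy sketch in the spirit of the factorization-based samplers of Poulson (2019) and Launay, Galerne, and Desolneux (2018); it is not the thesis's own code. Each item's inclusion probability is the current diagonal entry of the conditional kernel, which is then updated by a Schur complement step whose pivot is K_ii if the item was included and K_ii - 1 otherwise.

    import numpy as np

    def sequential_dpp_sampler(K, rng):
        """Sequential sampler for DPP(K) on {0, ..., M-1}.

        K is a real correlation kernel, not necessarily Hermitian, assumed to
        define a valid DPP so that every conditional probability below lies in
        [0, 1]. Item i is included with probability equal to the (i, i) entry
        of the kernel conditioned on the decisions already made."""
        A = np.array(K, dtype=float)           # working copy of the kernel
        M = A.shape[0]
        sample = []
        for i in range(M):
            p = A[i, i]                        # P[i in X | decisions on items 0, ..., i-1]
            include = rng.uniform() < p
            if include:
                sample.append(i)
            pivot = p if include else p - 1.0  # Schur complement pivot
            # Condition the kernel of the remaining items on the decision just made.
            A[i + 1:, i + 1:] -= np.outer(A[i + 1:, i], A[i, i + 1:]) / pivot
        return set(sample)

Run over all M items, this naive in-place update costs O(M^3) overall; the specialization to Hermitian kernels is the object of Appendix 2.A.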


Table of contents:

0 Introduction 
0.1 General introduction
0.2 Contributions
0.3 Outline of the manuscript
1 Determinantal point processes 
1.1 Definitions
1.2 How to construct a DPP?
Appendices
1.A Construction of fixed-sized point processes from exchangeable variables
1.B Classical matrix results
1.C Cauchy-Binet formulas
1.D Stability properties of finite DPPs
1.E Expectation and variance of linear statistics
2 Exact DPP sampling
2.1 Sampling from projection DPPs
2.1.1 The continuous case
2.1.2 The finite case
2.2 Exact sampling from non-projection DPPs
2.2.1 Finite DPPs defined by their correlation kernel
2.2.2 Finite DPPs defined by their likelihood kernel
2.2.3 The continuous case
Appendices
2.A Specialization of the sequential sampler for Hermitian DPPs
2.B A note on the sequential thinning procedure
3 Approximate DPP sampling 
3.1 Kernel approximation methods
3.2 Monte Carlo Markov chain methods
3.3 The zonotope viewpoint on nite projection DPPs
3.3.1 Hit-and-run and the Simplex Algorithm
3.3.2 From Volume to Squared Volume
3.4 Experiments
3.4.1 Non-uniform Spanning Trees
3.4.2 Text Summarization
3.5 Discussion
4 Application of DPP sampling to Monte Carlo integration 
4.1 Standard quadrature
4.2 The multivariate Jacobi ensemble
4.3 Description of the two DPP-based estimators
4.3.1 A natural estimator
4.3.2 The Ermakov-Zolotukhin estimator
4.4 Sampling from orthogonal projection DPPs
4.4.1 Sampling from the multivariate Jacobi ensemble
4.5 Empirical investigation
4.5.1 The bump experiment
4.5.2 Integrating sums of eigenfunctions
4.5.3 Further experiments
4.6 Discussion
Appendices
4.A Further experiments
4.A.1 Integrating absolute value
4.A.2 Integrating Heaviside
4.A.3 Integrating cosine
4.A.4 Integrating a mixture of smooth and non-smooth functions
5 Fast sampling from β-ensembles 
5.1 Classical β-ensembles and their tridiagonal models
5.2 Atomic measures, moments and Jacobi matrices
5.2.1 Orthogonal polynomials and Jacobi matrices
5.2.2 Orthogonal polynomials and moments
5.3 Making the change of variables
5.4 Proving the three classical tridiagonal models
5.4.1 The HE and its tridiagonal model
5.4.2 The LE and its tridiagonal model
5.4.3 The JE and its tridiagonal model
5.5 Gibbs sampling tridiagonal models associated to polynomial potentials
5.5.1 Sampling from the conditionals
5.5.2 Example simulations and empirical study of the convergence
5.6 Conclusion
Discussion
Résumé en français
Bibliography 

