Table of contents
1 Introduction
1.1 Context
1.1.1 Tasks
1.1.2 Types of Graphs
1.1.3 Outline
1.2 Tasks Studied During the Thesis
1.2.1 The Classification Task
1.2.2 The Forecasting Task
1.2.3 The Ranking Task
1.3 Learning Representations
1.4 Contributions
1.4.1 Learning Deterministic Representations for Heterogeneous Graph Node Classification (Chapter 4)
1.4.2 Learning Gaussian Representations
Learning Gaussian Representations for Heterogeneous Graph Node
Classification (Chapter 5)
Learning Gaussian Representations for Relational Time Series Forecasting
(Chapter 6)
Learning Gaussian Representations for Ranking (Chapter 7)
1.5 Thesis Organization
2 Learning Representations for Relational Data
2.1 Introduction
2.2 Learning Representations in Graphs
2.2.1 Inductive and Transductive Algorithms
Induction
Transduction
2.2.2 Supervised, Unsupervised and Semi-Supervised Learning
Unsupervised Learning
Supervised Learning
Semi-Supervised Learning
Our Contributions
2.3 Learning Deterministic Representations
2.3.1 Unsupervised Models
Learning from Context
2.3.2 Supervised and Semi-Supervised Models
Learning Nodes and Relationship Representations in Knowledge Bases
From Unlabeled to Labeled Data in Classification
2.3.3 Other Applications of Representation Learning
2.4 Capturing Uncertainties in Representations
Bayesian Approaches
Direct Approaches
2.5 Conclusion
I Learning Deterministic Representations and Application to Classification
3 State of the Art: Graph Node Classification
3.1 Introduction
3.2 Graph Node Classification
3.2.1 Simple Transductive Model for Graph Node Classification
3.2.2 Other Graph Node Classification Works
3.3 Conclusion
4 Learning Deterministic Representations for Classification
4.1 Introduction
4.2 Learning Representation for Graph Node Classification
4.2.1 Model
Loss Function
Classifier
Transductive Graph Model
4.2.2 Prior Parameters and Learning Algorithms
Prior Parameters
Learned Relation Specific Parameters
4.2.3 Algorithm
4.2.4 Experiments
Datasets
Construction of the LastFM Datasets
Evidence For Heterogeneous Node Reciprocal Influence
Comparison With Other Models
Evaluation Measures And Protocol
Results
Importance Of The Relations' Weights
Label Correlation on the LastFM2 Dataset
4.3 Conclusion
II Learning Gaussian Representations and Applications
5 Learning Gaussian Representations for Classification
5.1 Introduction
5.2 Graph Node Classification with Gaussian Embeddings
5.2.1 Model
Loss Function
Classifier
Graph Embedding
Prior Parameters and Learned Relation Specific Parameters
Algorithm
5.2.2 Experiments
Datasets
Results
Qualitative Discussion
5.3 Conclusion
6 Learning Gaussian Representations for Relational Time Series Forecasting
6.1 Introduction
6.1.1 Relational Time Series
6.1.2 Contribution
6.2 Related Work
6.3 Relational Time Series Forecasting with Gaussian Embeddings
6.3.1 Model
Notations and Task Formal Description
6.3.2 Informal Description
Model Definition
Impact of Minimizing the KL-Divergence on Predicted Values
Inference and Time Complexity
Variants
6.3.3 Experiments
Datasets
Baselines
Experimental Protocol
Results
6.4 Conclusion
7 Learning Gaussian Representations for Collaborative Filtering
7.1 Introduction
7.1.1 Recommender Systems and Uncertainty
7.1.2 Contribution
7.2 Learning Gaussian Embeddings for Collaborative Filtering
7.2.1 Model
The Gaussian Embeddings Ranking Model
Loss Function
Ordering Items
7.2.2 Experiments
Analysis
7.3 Conclusion
III Conclusion
8 Conclusion and Perspectives
8.1 Conclusion
8.1.1 Contributions
Graph Node Classification
Relational Time Series Forecasting
Collaborative Filtering
8.1.2 Learning Hyperparameters
8.1.3 From Deterministic to Gaussian Representations
8.2 Perspectives
8.2.1 Classification Task
8.2.2 Forecasting Task
8.2.3 Ranking Task
8.2.4 Learning Gaussian Embeddings for Knowledge Bases
Bibliography