Transforming Lattices into Matrices


Table of contents

1 Project Frame 
1.1 Introduction
1.2 Work Environment
1.2.1 LORIA, Orpailleur and Multispeech
1.2.2 Inria Project Lab HyAIAI
1.2.3 Tools, Repository and Testbed
1.3 Basic Background in Formal Concept Analysis
1.3.1 Formal Contexts and Formal Concepts
1.3.2 Formal Concept Lattices
1.3.3 Formal Concept Lattices Generation Algorithms
1.4 Basic Background in Deep Learning
1.4.1 Neural Networks
1.4.2 Deep Learning Algorithms
1.4.3 Usual Loss Functions
1.4.4 Binary Encoding and Softmax
1.4.5 Major Neural Network Architectures
1.4.6 Auto-Encoders and Embeddings
1.4.7 Metric Learning
1.5 Problem Statement
2 Initial Approach: Graph Generation 
2.1 State of the Art of Graph Modeling
2.1.1 Node-Centered Approaches
2.1.2 Whole Graph Approaches
2.1.3 Advantages of GraphRNN for Lattice Generation
2.2 Transforming Lattices into Matrices
2.2.1 Data Generation and Dataset
2.2.2 From Breadth-First Search to Level Ordering
2.2.3 Encoding the Lattice Adjacency
2.2.4 Encoding the Concepts
2.3 Using GraphRNN for Lattice Modeling
2.3.1 GraphRNN
2.3.2 GraphRNN Constrained VAE
2.4 Preliminary Experiments and Change of Approach
2.4.1 Reconstruction Performance Without Features
2.4.2 Formal Concept Representation and Change of Approach
3 Bag of Attributes: Embeddings for Formal Contexts 
3.1 State of the Art: FCA2VEC
3.1.1 Object2vec and Attribute2vec
3.1.2 Closure2vec
3.1.3 Evaluation
3.2 Bag of Attributes Model
3.2.1 Architecture
3.2.2 Training Objective
3.3 Training
3.3.1 Training Process
3.3.2 Training Dataset
3.3.3 Data Augmentation
3.3.4 Issues With KL Divergence
3.4 Experiments
3.4.1 Reconstruction Performance
3.4.2 Metric Learning Performance
3.4.3 Experiments on Real-World Datasets
4 Second Approach: Intents Generation 
4.1 Pilot Experiments
4.1.1 Variational Auto-Encoder Architecture
4.1.2 Convolutional Neural Network Architecture
4.1.3 Recurrent Neural Network Architecture and Attention Mechanisms
4.1.4 Conclusions on the Tested Architectures
4.2 Concept Number Upper Bound Prediction
4.3 Intents, Cover and Order Relation Generation
4.4 Training and First Experiments
5 Conclusion and Discussion 
Bibliography
