Leveraging Unsupervised Pretraining


Table of contents

I PROLOGUE 
1 INTRODUCTION
1.1 Thesis Outline
1.2 Publications
II TEXT GENERATION WITHOUT RETRIEVAL 
2 TEXT GENERATION
2.1 The Need for Text Generation
2.2 Structure of this Section
3 TEXT-TO-TEXT GENERATION 
3.1 Text-to-Text Generation
3.2 Multilingual Sentence Simplification
4 MEANING REPRESENTATION-TO-TEXT GENERATION
4.1 Structured Input to Text Generation
4.2 Abstract Meaning Representations
4.3 Multilingual AMR-to-Text Generation
III TEXT GENERATION WITH RETRIEVAL 
5 RETRIEVAL FOR KNOWLEDGE-BASED TEXT GENERATION
5.1 The Need for Knowledge
5.2 Challenges
5.3 Structure of this Section
6 KNOWLEDGE FROM A SINGLE DOCUMENT
6.1 Motivation: Document-Level Knowledge
6.2 Fact Checking as a Knowledge-Based Text Generation Task
6.3 Generating Fact Checking Briefs
7 SCALING KNOWLEDGE ACCESS TO MULTIPLE DOCUMENTS IN WIKIPEDIA
7.1 Motivation: Knowledge from Multiple Documents
7.2 Dialogue as a Knowledge-Based Text Generation Task
7.3 Augmenting Transformers with KNN-Based Composite Memory
8 SCALING KNOWLEDGE ACCESS TO THE OPEN WEB
8.1 Motivation: Knowledge from the Open Web
8.2 Wikipedia Article Writing as a Knowledge-Based Text Generation Task
8.3 Generating Biographies for Marginalized Groups on Wikipedia
9 KNOWLEDGE ON THE WEB, IN STRUCTURED FORM
9.1 Motivation: Knowledge in Structured Form
9.2 Long-form Question Answering as a Knowledge-Based Text Generation Task
9.3 Using Local Knowledge Graph Construction to Scale Seq2Seq Models to Multi-Document Inputs
IV EPILOGUE 
CONCLUSION 
BIBLIOGRAPHY
