Hidden Conditional Random Fields


Table of contents

1 Introduction 
2 Statistical models for time series modeling 
2.1 Statistical models
2.1.1 Notations
2.1.2 Supervised learning
2.1.3 Model types
2.2 Tasks and evaluation measures
2.2.1 Isolated classification
2.2.2 Recognition
2.2.3 Synthesis
2.3 Generative Markov models
2.3.1 Hidden Markov Models
2.3.2 Handling variability with HMMs
2.4 Discriminative Markov models
2.4.1 Conditional Random Fields
2.4.2 Hidden Conditional Random Fields
2.5 Conclusion
3 Contextual Hidden Markov Models 
3.1 Introduction
3.2 Single Gaussian Contextual HMM
3.2.1 Mean parameterization
3.2.2 Covariance parameterization
3.2.3 Transitions parameterization
3.2.4 Bayesian perspective
3.3 Training
3.3.1 With covariances parameterized
3.3.2 With transitions parameterized
3.3.3 Dynamic context
3.3.4 Gaussian mixtures
3.3.5 Tuning the gradient step size
3.4 CHMM relative to similar approaches
3.4.1 Variable Parametric HMMs
3.4.2 Maximum Likelihood Linear Regression
3.4.3 Context dependent modeling
3.5 Application to the classification of handwritten characters
3.5.1 Dataset
3.5.2 Preliminary results
3.5.3 Extended results
3.6 Conclusion
4 Contextual Hidden Conditional Random Fields 
4.1 Introduction
4.2 Discriminative training of Hidden Markov Models
4.2.1 MMI
4.2.2 MCE
4.2.3 MWE/MPE
4.2.4 Discussion
4.3 Exploiting contextual information in Hidden Conditional Random Fields
4.3.1 HCRF as a generalization of HMM
4.3.2 Contextual HCRFs
4.3.3 Training Contextual HCRFs
4.3.4 Experiments
4.4 Conclusion
5 Exploiting Contextual Markov Models for synthesis 
5.1 Motivation
5.2 Using HMMs for synthesis
5.2.1 Improved synthesis using non-stationary HMMs
5.2.2 Synthesis with constraints
5.3 Speech-to-motion synthesis, an application
5.3.1 Related work
5.4 Speech-to-motion synthesis using Contextual Markovian models
5.4.1 Parameterizations
5.4.2 Training
5.4.3 Synthesis
5.4.4 Experiments
5.5 Conclusion
6 Combining contextual variables 
6.1 Introduction
6.2 Dropout regularization
6.2.1 Dropout in CHMMs
6.3 Multistream combination of variables
6.3.1 Experimental setup
6.3.2 Contextual variables
6.3.3 CHMMs
6.3.4 Multistream CHMMs
6.4 Conclusion
7 Toward Transfer Learning 
7.1 Design of a global model
7.1.1 Using a class code as contextual variables
7.1.2 Task & dataset
7.1.3 Preliminary results with one-hot class coding
7.1.4 Using a distributed representation of class as contextual variables
7.1.5 Retraining discriminatively
7.2 Dynamic Factor Graphs
7.2.1 Continuous state space models
7.2.2 Analogy with Dynamic Factor Graphs
7.3 Conclusion
8 Conclusion & Perspectives 
Appendices
