Table of contents
1 Introduction and overview
1.1 Motivation for Contextual Machine Translation
1.2 Structure and detailed summary of this thesis
1.3 Publications related to this thesis
I State of the Art: Contextual Machine Translation
2 The Role of Context
2.1 Ambiguity and the problem of translation
2.1.1 Source language ambiguity
2.1.2 Cross-lingual meaning transfer ambiguity
2.1.3 Target language ambiguity
2.1.4 Human versus machine translation
2.2 The importance of context in MT
2.2.1 What is context?
2.2.2 Nature and use of context
2.3 Conclusion
3 Sentence-level Machine Translation
3.1 Statistical Machine Translation (SMT)
3.1.1 Word alignments
3.1.2 Phrase-based translation models
3.1.3 Domain adaptation
3.1.4 Successes and Limitations of SMT
3.2 Neural Machine Translation (NMT)
3.2.1 Neural networks for NLP
3.2.2 Sequence-to-sequence NMT
3.2.3 Sequence-to-sequence NMT with attention
3.2.4 Recent advances in NMT
3.2.5 Successes and limitations
3.3 Evaluating Machine Translation
3.3.1 Issues in human evaluation of MT quality
3.3.2 Standard automatic evaluation metrics
3.3.3 Discussion
4 Contextual Machine Translation
4.1 Evaluating contextual MT
4.1.1 Problems associated with automatic evaluation of context
4.1.2 MT metrics augmented with discourse information
4.1.3 Conclusion
4.2 Modelling context for MT
4.2.1 Modelling context for SMT
4.2.2 Modelling context for NMT
4.3 Translation using structured linguistic context
4.3.1 Anaphoric pronouns
4.3.2 Lexical choice
4.3.3 Discourse connectives
4.3.4 Whole document decoding
4.4 Translation using unstructured linguistic context
4.5 Translation using extra-linguistic context
4.6 Conclusion on evaluating contextual MT
II Using contextual information for Machine Translation: strategies and evaluation
5 Adapting translation to extra-linguistic context via pre-processing
5.1 Integrating speaker gender via domain adaptation
5.1.1 Annotating The Big Bang Theory reproducible corpus
5.1.2 SMT models: baselines and adaptations
5.1.3 Manual analysis and discussion
5.1.4 Conclusion on data partitioning
5.2 Conclusion
6 Improving cohesion-based translation using post-processing
6.1 Preserving style in MT: generating English tag questions
6.1.1 Tag questions (TQs) and the difficulty for MT
6.1.2 Improving TQ generation in MT into English: our post-edition approach
6.1.3 Results, analysis and discussion
6.1.4 Conclusion to our tag-question experiments
6.2 Anaphoric pronoun translation with linguistically motivated features
6.2.1 Classification system: description and motivation
6.2.2 Results, analysis and discussion
6.2.3 Conclusion to pronoun translation via post-edition
6.3 General conclusion on post-edition approaches
7 Context-aware translation models
7.1 Translating discourse phenomena with unstructured linguistic context
7.1.1 Hand-crafted test sets for contextual MT evaluation
7.1.2 Modifying the NMT architecture
7.1.3 Evaluation results and analysis
7.1.4 Conclusion and perspectives
7.2 Contextual NMT with extra-linguistic context
7.2.1 Creation of extra-linguistically annotated data
7.2.2 Contextual strategies
7.2.3 Experiments
7.2.4 BLEU score results
7.2.5 Targeted evaluation of speaker gender
7.2.6 Conclusion and perspectives
7.3 Conclusion
8 DiaBLa: A corpus for the evaluation of contextual MT
8.1 Dialogue and human judgment collection protocol
8.1.1 Participants
8.1.2 Scenarios
8.1.3 Evaluation
8.1.4 MT systems and setup
8.2 Description of the corpus
8.2.1 Overview of translation successes and failures
8.2.2 Comparison with existing corpora
8.3 Evaluating contextual MT with the DiaBLa corpus
8.3.1 Overall MT quality
8.3.2 Focus on a discourse-level phenomenon
8.4 Perspectives
8.4.1 Language analysis of MT-assisted interaction
8.4.2 MT evaluation
Conclusion and Perspectives
9 Conclusion and Perspectives
9.1 Conclusion
9.1.1 Trends in contextual MT and the impact on our work
9.1.2 Review of our aims and contributions
9.2 Perspectives
9.2.1 Evaluation of MT
9.2.2 Interpretability of contextual NMT strategies
9.2.3 Contextual MT for low resource language pairs
9.2.4 From contextual MT to multimodal MT
9.2.5 Conclusion: To the future and beyond the sentence
Appendices
A Context-aware translation models
A.1 Translating discourse phenomena with unstructured linguistic context
A.1.1 Training and decoding parameters
A.1.2 Visualisation of hierarchical attention weights
A.2 Contextual NMT with extra-linguistic context
A.2.1 Experimental setup
B DiaBLa: A corpus for the evaluation of contextual MT
B.1 Role-play scenarios
B.2 Dialogue collection: Final evaluation form
Bibliography



