Multi-task Neural Models For Translating Between Styles Within And Across Languages

Xing Niu, Sudha Rao, Marine Carpuat. arXiv 2018 – 47 citations

Generating natural language requires conveying content in an appropriate style. We explore two related tasks on generating text of varying formality: monolingual formality transfer and formality-sensitive machine translation. We propose to solve these tasks jointly using multi-task learning, and show that our models achieve state-of-the-art performance for formality transfer and are able to perform formality-sensitive translation without being explicitly trained on style-annotated translation examples.
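
The abstract states what the joint model does but not how the tasks are shared. One common realization of this kind of multi-task setup, consistent with the zero-shot claim above, is to train a single sequence-to-sequence model on a mixed stream in which formality-transfer pairs carry a control token naming the desired output formality while translation pairs carry none; at test time, attaching that token to a translation input requests formality-sensitive translation without any style-annotated bilingual data. The sketch below shows only this data-mixing step; the token names (`<2formal>`, `<2informal>`) and data layout are assumptions for illustration, not the authors' exact scheme.

```python
import random

# Hypothetical control tokens requesting the output formality; the paper's
# exact tagging scheme may differ.
FORMALITY_TAGS = {"formal": "<2formal>", "informal": "<2informal>"}

def tag_example(src_tokens, tgt_tokens, formality):
    """Prepend a formality control token to the source side of one pair."""
    return [FORMALITY_TAGS[formality]] + src_tokens, tgt_tokens

def build_multitask_stream(transfer_pairs, translation_pairs, seed=0):
    """Mix both tasks into one training stream for a shared seq2seq model.

    transfer_pairs:    (src_tokens, tgt_tokens, target_formality) triples,
                       e.g. informal-to-formal English rewrites; these carry
                       the control token.
    translation_pairs: (src_tokens, tgt_tokens) pairs, e.g. French-English;
                       left untagged, since no style-annotated translation
                       examples are assumed.
    """
    stream = [tag_example(s, t, f) for s, t, f in transfer_pairs]
    stream += list(translation_pairs)
    random.Random(seed).shuffle(stream)
    return stream

# Toy data for both tasks.
transfer = [(["hey", "wanna", "come", "?"],
             ["would", "you", "like", "to", "come", "?"], "formal")]
translation = [(["bonjour", "madame"], ["good", "morning", ",", "madam"])]

for src, tgt in build_multitask_stream(transfer, translation):
    print(src, "->", tgt)

# Zero-shot formality-sensitive translation request: reuse the transfer
# task's control token on a translation-task input at inference time.
fr_request, _ = tag_example(["bonjour", "madame"], [], "formal")
print(fr_request)  # ['<2formal>', 'bonjour', 'madame']
```

Keeping the translation data untagged during training is what makes the inference-time combination a zero-shot behavior: the model learns the meaning of the formality token only from the monolingual transfer task, then applies it to cross-lingual inputs.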
