
mT6: Multilingual Pretrained Text-to-Text Transformer with Translation Pairs

Zewen Chi, Li Dong, Shuming Ma, Shaohan Huang, Xian-Ling Mao, Heyan Huang, Furu Wei. Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing (EMNLP 2021) – 51 citations

[Paper]   Search on Google Scholar   Search on Semantic Scholar
Datasets, EMNLP, Evaluation, Interdisciplinary Approaches, Model Architecture, Neural Machine Translation, Question Answering, Training Techniques

Multilingual T5 (mT5) pretrains a sequence-to-sequence model on massive monolingual texts, an approach that has shown promising results on many cross-lingual tasks. In this paper, we improve the multilingual text-to-text transfer Transformer with translation pairs (mT6). Specifically, we explore three cross-lingual text-to-text pre-training tasks, namely, machine translation, translation pair span corruption, and translation span corruption. In addition, we propose a partially non-autoregressive objective for text-to-text pre-training. We evaluate the methods on eight multilingual benchmark datasets, covering sentence classification, named entity recognition, question answering, and abstractive summarization. Experimental results show that the proposed mT6 improves cross-lingual transferability over mT5.
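To make the pre-training tasks concrete, below is a minimal sketch of translation pair span corruption in the T5 style: the source and target sentences of a translation pair are concatenated, random contiguous spans are replaced with sentinel tokens, and the model must reconstruct the masked spans. This is an illustrative assumption of how such an objective can be built, not the authors' implementation; the function names, noise density, and span-length settings are hypothetical.

import random

# T5-style sentinel tokens used to mark masked spans (illustrative).
SENTINELS = [f"<extra_id_{i}>" for i in range(100)]

def span_corrupt(tokens, noise_density=0.15, mean_span_len=3, rng=random):
    """Mask random contiguous spans; return (corrupted input, reconstruction target)."""
    n_mask = max(1, round(len(tokens) * noise_density))
    masked = set()
    while len(masked) < n_mask:
        start = rng.randrange(len(tokens))
        length = max(1, round(rng.gauss(mean_span_len, 1)))
        masked.update(range(start, min(start + length, len(tokens))))

    inp, tgt, sid, in_span = [], [], 0, False
    for i, tok in enumerate(tokens):
        if i in masked:
            if not in_span:
                # Open a new masked span: emit one sentinel in both sequences.
                inp.append(SENTINELS[sid])
                tgt.append(SENTINELS[sid])
                sid += 1
                in_span = True
            tgt.append(tok)  # the target recovers the original span tokens
        else:
            inp.append(tok)
            in_span = False
    return inp, tgt

def translation_pair_span_corruption(src_tokens, tgt_tokens, **kw):
    """Corrupt spans drawn from the concatenated bilingual pair (TPSC-style sketch)."""
    return span_corrupt(src_tokens + tgt_tokens, **kw)

if __name__ == "__main__":
    src = "How old are you ?".split()
    tgt = "Wie alt bist du ?".split()
    x, y = translation_pair_span_corruption(src, tgt)
    print("input :", " ".join(x))
    print("target:", " ".join(y))

Under this reading, the machine translation task uses the source sentence as input and the full target sentence as output, while translation span corruption restricts the masked spans to one language of the pair; the partially non-autoregressive objective proposed in the paper changes how the decoder predicts the masked groups and is not captured by this sketch.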

Similar Work