
A Survey On Long Text Modeling With Transformers

Zican Dong, Tianyi Tang, Junyi Li, Wayne Xin Zhao. arXiv 2023 – 230 citations


Modeling long texts has been an essential technique in the field of natural language processing (NLP). With the ever-growing number of long documents, it is important to develop effective methods that can process and analyze such texts. However, long texts pose significant research challenges for existing text models, owing to their more complex semantics and special characteristics. In this paper, we provide an overview of recent advances in long text modeling based on Transformer models. First, we introduce the formal definition of long text modeling. Then, as the core content, we discuss how to process long input to satisfy the length limitation and how to design improved Transformer architectures that effectively extend the maximum context length. Following this, we discuss how to adapt Transformer models to capture the special characteristics of long texts. Finally, we describe four typical applications involving long text modeling and conclude the paper with a discussion of future directions. Our survey intends to provide researchers with a synthesis of, and pointers to, related work on long text modeling.
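The abstract mentions processing long input so it satisfies a model's length limitation. One common approach in this family is to split the token sequence into overlapping chunks that each fit the context window; the sketch below is purely illustrative (the function name, window size, and overlap are assumptions, not taken from the survey):

```python
# Minimal sketch of chunking a long token sequence into overlapping windows
# so each window fits a fixed maximum context length (illustrative only).

def chunk_tokens(tokens, max_len=512, stride=128):
    """Split `tokens` into windows of at most `max_len` tokens, with `stride`
    tokens of overlap so context near chunk boundaries is not lost."""
    if max_len <= stride:
        raise ValueError("max_len must exceed stride")
    chunks = []
    start = 0
    while start < len(tokens):
        chunks.append(tokens[start:start + max_len])
        if start + max_len >= len(tokens):
            break  # the final window already covers the tail of the sequence
        start += max_len - stride  # advance, keeping `stride` tokens of overlap
    return chunks

# Example: a 1000-token document with a 512-token limit and 128-token overlap
# yields windows starting at positions 0, 384, and 768.
doc = list(range(1000))
chunks = chunk_tokens(doc)
print([c[0] for c in chunks])   # → [0, 384, 768]
print([len(c) for c in chunks]) # → [512, 512, 232]
```

Each chunk can then be encoded independently and the per-chunk representations aggregated downstream; the overlap is a design choice that trades extra computation for continuity across boundaries.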
