
Hi-Transformer: Hierarchical Interactive Transformer for Efficient and Effective Long Document Modeling

Chuhan Wu, Fangzhao Wu, Tao Qi, Yongfeng Huang. Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (Volume 2: Short Papers), 2021 – 45 citations


The Transformer is central to text modeling, but it struggles with long documents because self-attention has quadratic complexity in the input length. To address this problem, we propose a hierarchical interactive Transformer (Hi-Transformer) for efficient and effective long document modeling. Hi-Transformer models documents hierarchically: it first learns sentence representations and then learns document representations. This effectively reduces complexity while still capturing global document context when modeling each sentence. More specifically, we first use a sentence Transformer to learn a representation of each sentence. We then use a document Transformer to model the global document context from these sentence representations. Next, we use another sentence Transformer to enhance sentence modeling with the global document context. Finally, we apply a hierarchical pooling method to obtain the document embedding. Extensive experiments on three benchmark datasets validate the efficiency and effectiveness of Hi-Transformer for long document modeling.
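To make the described data flow concrete, below is a minimal PyTorch sketch of one such hierarchical block (not the authors' code). With S sentences of length ℓ, attention cost drops from O((Sℓ)²) for a flat Transformer to roughly O(Sℓ² + S²) here, which is where the efficiency gain comes from. The layer sizes, mean pooling, and the prepending of the context vector to each sentence are illustrative assumptions; the paper's exact pooling and interaction mechanism may differ.

```python
import torch
import torch.nn as nn


class HiTransformerLayer(nn.Module):
    """One hierarchical interactive block: sentence Transformer ->
    document Transformer -> context-enhanced sentence Transformer.
    Dimensions, mean pooling, and the prepend-context step are
    illustrative assumptions, not the paper's exact configuration."""

    def __init__(self, d_model: int = 256, n_heads: int = 4):
        super().__init__()
        self.sent_encoder = nn.TransformerEncoderLayer(d_model, n_heads, batch_first=True)
        self.doc_encoder = nn.TransformerEncoderLayer(d_model, n_heads, batch_first=True)
        self.sent_enhancer = nn.TransformerEncoderLayer(d_model, n_heads, batch_first=True)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, n_sents, sent_len, d_model) token embeddings
        b, s, l, d = x.shape
        # 1) Sentence Transformer: self-attention within each sentence only.
        tokens = self.sent_encoder(x.reshape(b * s, l, d))
        # 2) Pool each sentence's tokens into a single sentence vector
        #    (mean pooling here as a stand-in for the paper's scheme).
        sent_repr = tokens.mean(dim=1).reshape(b, s, d)
        # 3) Document Transformer: attention across the s sentence vectors
        #    captures global document context at only O(s^2) cost.
        doc_ctx = self.doc_encoder(sent_repr)
        # 4) Second sentence Transformer: each sentence re-reads its tokens
        #    together with its document-aware context vector.
        enhanced = torch.cat([doc_ctx.reshape(b * s, 1, d), tokens], dim=1)
        out = self.sent_enhancer(enhanced)[:, 1:, :]  # drop the context slot
        return out.reshape(b, s, l, d)
```

A short usage example; the final mean over sentences and tokens is a crude stand-in for the hierarchical pooling the abstract mentions:

```python
layer = HiTransformerLayer()
x = torch.randn(2, 8, 32, 256)        # 2 docs, 8 sentences, 32 tokens each
doc_emb = layer(x).mean(dim=(1, 2))   # document embeddings: (2, 256)
```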
