Attention-based Neural Text Segmentation

Pinkesh Badjatiya, Litton J Kurisinkel, Manish Gupta, Vasudeva Varma. Lecture Notes in Computer Science, 2018 – 63 citations

Tags: Datasets, Evaluation, Model Architecture, Question Answering

Text segmentation plays an important role in various Natural Language Processing (NLP) tasks such as summarization, context understanding, document indexing, and document noise removal. Previous methods for this task rely on manual feature engineering and suffer from large memory requirements and long execution times. To the best of our knowledge, this paper is the first to present a supervised neural approach for text segmentation. Specifically, we propose an attention-based bidirectional LSTM model in which sentence embeddings are learned using CNNs and segment boundaries are predicted from contextual information. The model can automatically handle variable-sized context. Compared to existing competitive baselines, the proposed model shows a performance improvement of ~7% in WinDiff score on three benchmark datasets.
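
The abstract describes a three-part pipeline: a CNN that turns each sentence's word embeddings into a fixed-size sentence embedding, a bidirectional LSTM that runs over the sequence of sentence embeddings, and an attention mechanism that pools contextual information before a per-sentence boundary decision. The sketch below is an illustrative PyTorch approximation of that wiring, not the authors' implementation; all layer sizes, the additive attention formulation, and the classifier layout are assumptions for demonstration.

```python
# Illustrative sketch only -- not the authors' code. Layer sizes, names,
# and the attention formulation are assumptions for demonstration.
import torch
import torch.nn as nn
import torch.nn.functional as F


class CNNSentenceEncoder(nn.Module):
    """Fixed-size sentence embedding from word embeddings via 1-D convolutions."""

    def __init__(self, vocab_size, emb_dim=100, num_filters=64, kernel_sizes=(2, 3, 4)):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim, padding_idx=0)
        self.convs = nn.ModuleList(
            nn.Conv1d(emb_dim, num_filters, k, padding=k // 2) for k in kernel_sizes
        )
        self.out_dim = num_filters * len(kernel_sizes)

    def forward(self, word_ids):                      # (num_sents, sent_len)
        x = self.embed(word_ids).transpose(1, 2)      # (num_sents, emb_dim, sent_len)
        pooled = [F.relu(conv(x)).max(dim=2).values for conv in self.convs]
        return torch.cat(pooled, dim=1)               # (num_sents, out_dim)


class AttnBiLSTMSegmenter(nn.Module):
    """Bi-LSTM over sentence embeddings, attention-pooled context, per-sentence boundary logit."""

    def __init__(self, vocab_size, hidden_dim=128):
        super().__init__()
        self.sent_enc = CNNSentenceEncoder(vocab_size)
        self.bilstm = nn.LSTM(self.sent_enc.out_dim, hidden_dim,
                              batch_first=True, bidirectional=True)
        self.attn = nn.Linear(2 * hidden_dim, 1)
        self.classifier = nn.Linear(4 * hidden_dim, 1)  # [sentence state ; attended context]

    def forward(self, doc_word_ids):                  # (num_sents, sent_len)
        sent_emb = self.sent_enc(doc_word_ids)        # (num_sents, sent_dim)
        states, _ = self.bilstm(sent_emb.unsqueeze(0))  # (1, num_sents, 2*hidden)
        states = states.squeeze(0)
        weights = torch.softmax(self.attn(states), dim=0)       # (num_sents, 1)
        context = (weights * states).sum(dim=0, keepdim=True)   # (1, 2*hidden)
        context = context.expand_as(states)
        logits = self.classifier(torch.cat([states, context], dim=1)).squeeze(-1)
        return logits                                 # one boundary logit per sentence


# Toy usage: a 5-sentence "document", each sentence padded to 8 word ids.
model = AttnBiLSTMSegmenter(vocab_size=1000)
doc = torch.randint(1, 1000, (5, 8))
boundary_probs = torch.sigmoid(model(doc))
print(boundary_probs.shape)   # torch.Size([5])
```

In this reading, each sentence's boundary decision combines its own Bi-LSTM state with an attention-weighted summary of the surrounding sentences, which is one plausible way to realize the "variable-sized context" behavior the abstract mentions.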

Similar Work