Earlier Attention? Aspect-aware LSTM For Aspect-based Sentiment Analysis

Bowen Xing, Lejian Liao, Dandan Song, Jingang Wang, Fuzheng Zhang, Zhongyuan Wang, Heyan Huang. Proceedings of the Twenty-Eighth International Joint Conference on Artificial Intelligence (IJCAI), 2019 – 49 citations

Tags: Affective Computing · Datasets · IJCAI · Interdisciplinary Approaches · Model Architecture · Multimodal Semantic Representation

Aspect-based sentiment analysis (ABSA) aims to predict the fine-grained sentiment of a comment with respect to a given aspect term or category. Previous ABSA work has recognized and verified the importance of aspect information. Most existing LSTM-based models take the aspect into account via an attention mechanism, where attention weights are computed only after the context has been encoded into contextual vectors. However, during this context-modeling stage, classic LSTM cells may already have discarded aspect-related information and retained aspect-irrelevant information, so the resulting context representations are suboptimal. This paper proposes a novel LSTM variant, termed aspect-aware LSTM (AA-LSTM), which injects aspect information into the LSTM cells during context modeling, before the attention mechanism is applied. AA-LSTM can therefore dynamically produce aspect-aware contextual representations. We evaluate several representative LSTM-based models with their classic LSTM cells replaced by AA-LSTM cells; experimental results on the SemEval-2014 datasets demonstrate the effectiveness of AA-LSTM.
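The abstract does not reproduce the AA-LSTM gating equations, but the core idea is to let the aspect embedding influence the cell's gates while the context is still being encoded, rather than only at the attention stage. Below is a minimal PyTorch sketch of that idea, assuming (hypothetically) that the aspect vector is simply concatenated with the token input and previous hidden state when computing the gates; the paper's actual AA-LSTM formulation differs in its details.

```python
import torch
import torch.nn as nn

class AspectAwareLSTMCell(nn.Module):
    """Illustrative aspect-aware LSTM cell (not the paper's exact equations).

    The aspect embedding joins the gate inputs at every time step, so
    aspect information can steer what the cell keeps or forgets during
    context modeling, before any attention layer is applied.
    """

    def __init__(self, input_size: int, hidden_size: int, aspect_size: int):
        super().__init__()
        self.hidden_size = hidden_size
        # One linear map produces all four gate pre-activations at once.
        self.gates = nn.Linear(input_size + hidden_size + aspect_size,
                               4 * hidden_size)

    def forward(self, x_t, aspect, state):
        h_prev, c_prev = state
        # Aspect vector participates in every gate computation.
        z = self.gates(torch.cat([x_t, h_prev, aspect], dim=-1))
        i, f, g, o = z.chunk(4, dim=-1)
        c_t = torch.sigmoid(f) * c_prev + torch.sigmoid(i) * torch.tanh(g)
        h_t = torch.sigmoid(o) * torch.tanh(c_t)
        return h_t, c_t

# Toy usage: unroll over a 5-token sentence with a fixed aspect vector.
cell = AspectAwareLSTMCell(input_size=300, hidden_size=128, aspect_size=300)
h = torch.zeros(1, 128)
c = torch.zeros(1, 128)
aspect = torch.randn(1, 300)            # embedding of the aspect term
for x_t in torch.randn(5, 1, 300):      # token embeddings
    h, c = cell(x_t, aspect, (h, c))    # hidden states become aspect-aware
```

In this sketch, the hidden states fed to a downstream attention layer are already conditioned on the aspect, which is the property the paper argues classic LSTM cells lack.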
