
SentiBERT: A Transferable Transformer-Based Architecture for Compositional Sentiment Semantics

Da Yin, Tao Meng, Kai-Wei Chang. Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics (ACL), 2020 – 116 citations

[Paper]

We propose SentiBERT, a variant of BERT that effectively captures compositional sentiment semantics. The model combines contextualized representations with a binary constituency parse tree to capture semantic composition. Comprehensive experiments demonstrate that SentiBERT achieves competitive performance on phrase-level sentiment classification. We further demonstrate that the sentiment composition learned from the phrase-level annotations on SST can be transferred to other sentiment analysis tasks as well as related tasks, such as emotion classification. Moreover, we conduct ablation studies and design visualization methods to understand SentiBERT. We show that SentiBERT is better than baseline approaches at capturing negation and contrastive relations and at modeling compositional sentiment semantics.
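The core idea of the abstract — composing phrase representations bottom-up along a binary constituency parse tree over contextualized token vectors — can be illustrated with a minimal sketch. This is not the paper's exact two-level attention mechanism; the scoring vector `W`, the random token vectors (stand-ins for BERT outputs), and the nested-tuple tree encoding are all illustrative assumptions.

```python
import numpy as np

def softmax(x):
    """Numerically stable softmax over a 1-D array."""
    e = np.exp(x - np.max(x))
    return e / e.sum()

def compose(node, token_vecs, W):
    """Recursively build a phrase vector for each node of a binary
    constituency tree. A leaf is an int index into token_vecs; an
    internal node is a pair (left, right). Each internal node combines
    its two children with a simple attention (a hypothetical stand-in
    for SentiBERT's learned attention-based composition)."""
    if isinstance(node, int):                 # leaf: token index
        return token_vecs[node]
    left = compose(node[0], token_vecs, W)
    right = compose(node[1], token_vecs, W)
    children = np.stack([left, right])        # shape (2, d)
    weights = softmax(children @ W)           # attention over the two children
    return weights @ children                 # (d,) composed phrase vector

# Toy example: a two-token phrase "not good" with tree (0, 1).
d = 4
rng = np.random.default_rng(0)
tokens = rng.standard_normal((2, d))          # stand-in for BERT token outputs
W = rng.standard_normal(d)                    # hypothetical scoring vector
phrase = compose((0, 1), tokens, W)
```

Because the attention weights are a convex combination, each phrase vector stays within the span of its children's coordinates; the actual model instead learns the composition jointly with BERT fine-tuning on SST's phrase-level labels.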

Similar Work