Pairwise Supervised Contrastive Learning Of Sentence Representations

Dejiao Zhang, Shang-Wen Li, Wei Xiao, Henghui Zhu, Ramesh Nallapati, Andrew O. Arnold, Bing Xiang. Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing (EMNLP 2021) – 41 citations

Topics: Compositional Generalization, Datasets, EMNLP, Few Shot, Fine Tuning, Interdisciplinary Approaches

Many recent successes in sentence representation learning have been achieved by simply fine-tuning on Natural Language Inference (NLI) datasets with a triplet or siamese loss. Nevertheless, these approaches share a common weakness: sentences in a contradiction pair are not necessarily from different semantic categories. Therefore, optimizing the semantic entailment and contradiction reasoning objective alone is inadequate to capture high-level semantic structure. The drawback is compounded by the fact that vanilla siamese or triplet losses only learn from individual sentence pairs or triplets, which often leads to bad local optima. In this paper, we propose PairSupCon, an instance-discrimination-based approach that aims to bridge semantic entailment and contradiction understanding with high-level categorical concept encoding. We evaluate PairSupCon on various downstream tasks that involve understanding sentence semantics at different granularities. We outperform the previous state-of-the-art method with a 10%–13% average improvement on eight clustering tasks and a 5%–6% average improvement on seven semantic textual similarity (STS) tasks.
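The abstract describes the objective only at a high level. For intuition, the sketch below shows one way an instance-discrimination term over NLI pairs could be combined with a pairwise entailment/contradiction term. Everything here (the function name, the equal weighting of the two terms, the temperature, the use of a binary cross-entropy pairwise term) is an illustrative assumption, not the authors' implementation; consult the paper for the exact formulation.

```python
import torch
import torch.nn.functional as F

def pairsupcon_style_loss(anchor_emb, pair_emb, pair_labels, temperature=0.05):
    """Hypothetical sketch of a PairSupCon-style objective (not the authors' code).

    anchor_emb, pair_emb: (batch, dim) embeddings of the two sides of each NLI pair.
    pair_labels: (batch,) with 1 for entailment pairs, 0 for contradiction pairs.
    Assumes the batch contains at least one entailment pair.
    """
    # L2-normalize so dot products are cosine similarities.
    a = F.normalize(anchor_emb, dim=-1)
    b = F.normalize(pair_emb, dim=-1)

    # Instance-discrimination term: each anchor should retrieve its own
    # entailment partner among all in-batch candidates (other sentences in
    # the batch act as negatives).
    logits = a @ b.t() / temperature                 # (batch, batch)
    targets = torch.arange(a.size(0), device=a.device)
    ent_mask = pair_labels.bool()
    instance_loss = F.cross_entropy(logits[ent_mask], targets[ent_mask])

    # Pairwise term: a binary entailment-vs-contradiction objective on the
    # cosine similarity of each pair, pushing contradiction pairs apart.
    pair_sim = (a * b).sum(dim=-1)                   # (batch,)
    pair_loss = F.binary_cross_entropy_with_logits(
        pair_sim / temperature, pair_labels.float())

    return instance_loss + pair_loss
```

In this sketch a batch consists of premise-hypothesis pairs: entailment pairs serve as positives for instance discrimination, while all other in-batch sentences act as negatives, which is what lets the objective learn from many sentences at once rather than from isolated pairs or triplets.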

Similar Work