
Charagram: Embedding Words And Sentences Via Character N-grams

John Wieting, Mohit Bansal, Kevin Gimpel, Karen Livescu. Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing (EMNLP 2016) – 182 citations

[Paper]
Tags: Compositional Generalization, EMNLP, Evaluation, Interdisciplinary Approaches, Neural Machine Translation, Variational Autoencoders

We present Charagram embeddings, a simple approach for learning character-based compositional models to embed textual sequences. A word or sentence is represented using a character n-gram count vector, followed by a single nonlinear transformation to yield a low-dimensional embedding. We use three tasks for evaluation: word similarity, sentence similarity, and part-of-speech tagging. We demonstrate that Charagram embeddings outperform more complex architectures based on character-level recurrent and convolutional neural networks, achieving new state-of-the-art performance on several similarity tasks.
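The core idea in the abstract — a character n-gram count vector followed by a single nonlinear transformation — can be sketched as follows. This is a minimal illustrative implementation, not the paper's released code; the n-gram range, embedding dimension, tanh nonlinearity, and random initialization are assumptions (the paper trains the projection on similarity and tagging objectives).

```python
import numpy as np

def char_ngrams(text, n_min=2, n_max=4):
    # Pad with boundary markers so prefixes/suffixes get distinct n-grams
    s = f"#{text}#"
    return [s[i:i + n]
            for n in range(n_min, n_max + 1)
            for i in range(len(s) - n + 1)]

class CharagramSketch:
    """Charagram-style embedding sketch: count the character n-grams of a
    sequence, then apply one learned nonlinear transformation.
    Dimensions and initialization here are illustrative, not the paper's."""

    def __init__(self, vocab, dim=300, seed=0):
        rng = np.random.default_rng(seed)
        self.index = {g: i for i, g in enumerate(vocab)}
        # Single projection layer; in the paper this is learned from data
        self.W = rng.normal(scale=0.1, size=(dim, len(vocab)))
        self.b = np.zeros(dim)

    def embed(self, text):
        # Character n-gram count vector (out-of-vocabulary n-grams ignored)
        counts = np.zeros(len(self.index))
        for g in char_ngrams(text):
            if g in self.index:
                counts[self.index[g]] += 1
        # One nonlinear transformation yields the low-dimensional embedding
        return np.tanh(self.W @ counts + self.b)

# Usage: build a toy n-gram vocabulary and embed a word
vocab = sorted(set(char_ngrams("character embeddings")))
model = CharagramSketch(vocab, dim=16)
vec = model.embed("character")
```

Because the representation is a bag of character n-grams, morphologically related or misspelled words share many n-grams and thus receive nearby embeddings, which is what lets this simple model compete with character-level RNNs and CNNs on the similarity tasks above.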

Similar Work