MGNC-CNN: A Simple Approach To Exploiting Multiple Word Embeddings For Sentence Classification


Ye Zhang, Stephen Roller, Byron Wallace. Proceedings of the 2016 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies (NAACL-HLT 2016). 103 citations.

Model Architecture

We introduce a novel, simple convolutional neural network (CNN) architecture, multi-group norm constraint CNN (MGNC-CNN), that capitalizes on multiple sets of word embeddings for sentence classification. MGNC-CNN extracts features from each input embedding set independently and then joins these at the penultimate layer in the network to form a final feature vector. We then adopt a group regularization strategy that differentially penalizes the weights associated with the subcomponents generated from the respective embedding sets. This model is much simpler than comparable alternative architectures and requires substantially less training time. Furthermore, it is flexible in that it does not require the input word embeddings to be of the same dimensionality. We show that MGNC-CNN consistently outperforms baseline models.
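The core ideas above (independent convolutional feature extraction per embedding set, concatenation at the penultimate layer, and a per-group regularization penalty) can be sketched as follows. This is a minimal NumPy illustration under assumed shapes and hypothetical function names, not the authors' implementation; a real model would learn the filters and classifier weights by gradient descent.

```python
import numpy as np

def extract_features(embedded, filters):
    """Convolve filters over a (seq_len, dim) embedded sentence and
    max-pool over time, yielding one feature per filter."""
    feats = []
    for f in filters:                      # f has shape (width, dim)
        w = f.shape[0]
        conv = np.array([np.sum(embedded[i:i + w] * f)
                         for i in range(embedded.shape[0] - w + 1)])
        feats.append(conv.max())           # max-over-time pooling
    return np.array(feats)

def mgnc_features(embedding_sets, filter_banks):
    """Extract features from each embedding set independently (the sets
    may have different dimensionalities), then concatenate the groups
    into the final penultimate-layer feature vector."""
    groups = [extract_features(e, fb)
              for e, fb in zip(embedding_sets, filter_banks)]
    return np.concatenate(groups), groups

def group_penalty(weight_groups, lambdas):
    """Group regularization: penalize the classifier weights for each
    embedding set's subcomponent with its own coefficient lambda_g."""
    return sum(lam * np.linalg.norm(w)
               for w, lam in zip(weight_groups, lambdas))
```

For example, a sentence embedded with two sets of dimensions 5 and 8 can use separate filter banks per set; the final feature vector is simply the concatenation of the two groups, and `group_penalty` applies a distinct norm constraint to each group's weights.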

Similar Work