
Memory Augmented Multi-instance Contrastive Predictive Coding For Sequential Recommendation

Ruihong Qiu, Zi Huang, Hongzhi Yin. IEEE International Conference on Data Mining (ICDM), 2021 – 41 citations

Tags: Compositional Generalization, Datasets, Evaluation Tools, Training Techniques

Sequential recommendation aims to recommend items, such as products, songs, and places, to users based on the sequential patterns in their historical records. Most existing sequential recommender models use the next-item prediction task as the training signal. Unfortunately, these methods face two essential challenges: (1) long-term preference is difficult to capture, and (2) the supervision signal is too sparse to train a model effectively. In this paper, we propose a novel sequential recommendation framework that overcomes these challenges with a memory augmented multi-instance contrastive predictive coding scheme, denoted MMInfoRec. The basic contrastive predictive coding (CPC) serves as the encoder of sequences and items. A memory module is designed to augment the auto-regressive prediction in CPC, enabling a flexible and general representation of the encoded preference and improving the ability to capture long-term preference. For effective training of MMInfoRec, a novel multi-instance noise contrastive estimation (MINCE) loss is proposed, which uses multiple positive samples and thereby exploits the samples within a mini-batch effectively. Although the proposed MMInfoRec framework follows the contrastive learning style, no further fine-tuning step is required because its contrastive training task is well aligned with the target recommendation task. Extensive experiments on four benchmark datasets show that MMInfoRec outperforms state-of-the-art baselines.
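To make the MINCE idea more concrete, below is a minimal, illustrative sketch (not the authors' released code) of a multi-positive InfoNCE-style loss in the spirit described by the abstract: each encoded sequence preference is contrasted against several positive item embeddings, while the positives belonging to the other sequences in the mini-batch act as negatives. The tensor shapes, temperature value, and function name are assumptions made for illustration.

```python
import torch
import torch.nn.functional as F


def mince_loss(seq_emb: torch.Tensor, pos_emb: torch.Tensor,
               temperature: float = 0.1) -> torch.Tensor:
    """Multi-positive contrastive loss sketch.

    seq_emb: [B, D] encoded sequence preferences.
    pos_emb: [B, K, D] K positive item embeddings per sequence.
    """
    B, K, D = pos_emb.shape
    seq_emb = F.normalize(seq_emb, dim=-1)                  # [B, D]
    cand = F.normalize(pos_emb.reshape(B * K, D), dim=-1)   # [B*K, D] all candidates in the batch

    # Similarity of every sequence to every candidate item in the mini-batch.
    logits = seq_emb @ cand.t() / temperature               # [B, B*K]

    # Boolean mask marking which candidates are positives for each sequence.
    pos_mask = torch.zeros_like(logits, dtype=torch.bool)
    for i in range(B):
        pos_mask[i, i * K:(i + 1) * K] = True

    # Multi-positive InfoNCE: log-sum-exp over positives minus over all candidates.
    log_prob_all = torch.logsumexp(logits, dim=1)                                        # [B]
    log_prob_pos = torch.logsumexp(logits.masked_fill(~pos_mask, float("-inf")), dim=1)  # [B]
    return (log_prob_all - log_prob_pos).mean()


# Usage sketch: 128 sequences, 3 positive samples each, 64-dim embeddings (all hypothetical).
loss = mince_loss(torch.randn(128, 64), torch.randn(128, 3, 64))
```

Using multiple positives per anchor means every in-batch sample contributes to the loss either as a positive or a negative, which is the sample-efficiency benefit the abstract attributes to MINCE.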

Similar Work