
AdaMCT: Adaptive Mixture of CNN-Transformer for Sequential Recommendation

Juyong Jiang, Peiyan Zhang, Yingtao Luo, Chaozhuo Li, Jae Boum Kim, Kai Zhang, Senzhang Wang, Xing Xie, Sunghun Kim. CIKM '23: The 32nd ACM International Conference on Information and Knowledge Management, 2023 – 41 citations

[Code] [Paper]
CIKM · Efficiency · Ethics & Fairness · Has Code · Model Architecture · Productivity Enhancement

Sequential recommendation (SR) aims to model users' dynamic preferences from a series of interactions. A pivotal challenge in user modeling for SR lies in the inherent variability of user preferences. An effective SR model is expected to capture both the long-term and short-term preferences exhibited by users, wherein the former can offer a comprehensive understanding of stable interests that influence the latter. To capture such information more effectively, we incorporate locality inductive bias into the Transformer by amalgamating its global attention mechanism with a local convolutional filter, and adaptively ascertain the mixing importance on a personalized basis through layer-aware adaptive mixture units, termed AdaMCT. Moreover, as users may repeatedly browse potential purchases, it is desirable to consider multiple relevant items concurrently when modeling long-/short-term preferences. Given that softmax-based attention may promote unimodal activation, we introduce Squeeze-Excitation Attention (with sigmoid activation) into SR models to capture multiple pertinent items (keys) simultaneously. Extensive experiments on three widely employed benchmarks substantiate the effectiveness and efficiency of our proposed approach. Source code is available at https://github.com/juyongjiang/AdaMCT.
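The abstract describes two core ideas: mixing a global Transformer branch with a local convolutional branch via a learned, layer-aware gate, and replacing softmax attention with a sigmoid-activated Squeeze-Excitation-style gate so several keys can activate at once. Below is a minimal PyTorch sketch of these ideas; the module names, shapes, and the exact gating design are illustrative assumptions, not the authors' implementation (see the linked repository for that).

```python
import torch
import torch.nn as nn

class SigmoidAttention(nn.Module):
    """Sketch: sigmoid-gated attention. Unlike softmax, each key is scored
    independently in [0, 1], so multiple relevant items can activate at once.
    Hypothetical module, not the paper's exact formulation."""
    def __init__(self, d_model: int):
        super().__init__()
        self.scale = d_model ** -0.5

    def forward(self, q, k, v):
        scores = torch.matmul(q, k.transpose(-2, -1)) * self.scale
        weights = torch.sigmoid(scores)  # no sum-to-one constraint across keys
        return torch.matmul(weights, v)

class AdaptiveMixtureBlock(nn.Module):
    """Sketch of one AdaMCT-style block: a global-attention branch (long-term
    preferences) and a local-convolution branch (short-term preferences),
    mixed by a learned, input-dependent weight. The pooled-input gate here is
    an assumed stand-in for the paper's layer-aware adaptive mixture units."""
    def __init__(self, d_model: int, n_heads: int = 2, kernel_size: int = 3):
        super().__init__()
        self.attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.conv = nn.Conv1d(d_model, d_model, kernel_size,
                              padding=kernel_size // 2)
        self.gate = nn.Sequential(nn.Linear(d_model, 1), nn.Sigmoid())
        self.norm = nn.LayerNorm(d_model)

    def forward(self, x):
        # x: (batch, seq_len, d_model) sequence of item embeddings
        global_out, _ = self.attn(x, x, x)                         # global branch
        local_out = self.conv(x.transpose(1, 2)).transpose(1, 2)   # local branch
        alpha = self.gate(x.mean(dim=1, keepdim=True))             # (batch, 1, 1)
        return self.norm(x + alpha * global_out + (1 - alpha) * local_out)

# usage: mix long-/short-term signals over a batch of item-embedding sequences
block = AdaptiveMixtureBlock(d_model=64)
out = block(torch.randn(8, 50, 64))  # -> (8, 50, 64)
```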

Similar Work