Empowering News Recommendation With Pre-trained Language Models

Chuhan Wu, Fangzhao Wu, Tao Qi, Yongfeng Huang. SIGIR '21: The 44th International ACM SIGIR Conference on Research and Development in Information Retrieval 2021 – 98 citations


Personalized news recommendation is an essential technique for online news services. News articles usually contain rich textual content, and accurate news modeling is important for personalized news recommendation. Existing news recommendation methods mainly model news texts with traditional text modeling methods, which are not optimal for mining the deep semantic information in news texts. Pre-trained language models (PLMs) are powerful in natural language understanding and have the potential to enable better news modeling. However, no public report shows that PLMs have been applied to news recommendation. In this paper, we report our work on exploiting pre-trained language models to empower news recommendation. Offline experimental results on both monolingual and multilingual news recommendation datasets show that leveraging PLMs for news modeling can effectively improve the performance of news recommendation. Our PLM-empowered news recommendation models have been deployed to the Microsoft News platform, and achieved significant gains in terms of both clicks and pageviews in both English-speaking and global markets.
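The pipeline the abstract describes — encode each news article with a PLM, pool a user's clicked news into a user vector, and score candidates by dot product — can be sketched as follows. This is a minimal, hedged illustration, not the paper's implementation: `plm_encode` is a hypothetical stand-in that returns fixed random vectors where a real system would use PLM (e.g., BERT) title embeddings, and the attention query would be learned rather than sampled.

```python
import numpy as np

rng = np.random.default_rng(0)

def plm_encode(texts, dim=768):
    # Stand-in for a PLM news encoder (e.g., a BERT [CLS] embedding per title).
    # A real system would run each title through the pre-trained model;
    # random vectors are used here only to keep the sketch runnable.
    return rng.normal(size=(len(texts), dim))

def attention_pool(clicked_vecs, query):
    # Attention-style pooling over a user's clicked news vectors,
    # producing a single user-interest vector.
    scores = clicked_vecs @ query
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()
    return weights @ clicked_vecs

# Encode the user's click history and pool it into a user vector.
clicked = plm_encode(["title a", "title b", "title c"])
query = rng.normal(size=768)      # would be a learned parameter in a real model
user_vec = attention_pool(clicked, query)

# Score candidate news by dot product and rank them.
candidates = plm_encode(["candidate 1", "candidate 2"])
scores = candidates @ user_vec
ranking = np.argsort(-scores)
```

In the deployed setting described in the paper, only the news encoder is swapped from a traditional text model to a PLM; the user modeling and dot-product matching stages of standard news recommenders are kept.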
