
POLITICS: Pretraining With Same-story Article Comparison For Ideology Prediction And Stance Detection

Yujian Liu, Xinliang Frederick Zhang, David Wegsman, Nick Beauchamp, Lu Wang. Findings of the Association for Computational Linguistics: NAACL 2022 – 40 citations

[Paper]
Datasets, Fine Tuning, NAACL, Training Techniques

Ideology is at the core of political science research. Yet, general-purpose tools to characterize and predict ideology across different genres of text still do not exist. To this end, we study Pretrained Language Models using novel ideology-driven pretraining objectives that rely on the comparison of articles on the same story written by media of different ideologies. We further collect a large-scale dataset, consisting of more than 3.6M political news articles, for pretraining. Our model POLITICS outperforms strong baselines and the previous state-of-the-art models on ideology prediction and stance detection tasks. Further analyses show that POLITICS is especially good at understanding long or formally written texts, and is also robust in few-shot learning scenarios.
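The abstract only summarizes the pretraining idea, so the snippet below is a minimal, hypothetical sketch of one way a same-story comparison objective could look: article embeddings from the same news story are contrasted, with same-ideology pairs treated as positives and different-ideology pairs as in-story negatives. The function name, tensor shapes, and the InfoNCE-style formulation are illustrative assumptions, not the paper's exact objectives or implementation.

```python
import torch
import torch.nn.functional as F


def story_contrastive_loss(embeddings, story_ids, ideology_ids, temperature=0.07):
    """Hypothetical story-level contrastive objective (not the paper's exact loss).

    embeddings:   (N, d) pooled article representations
    story_ids:    (N,) integer id of the news story each article covers
    ideology_ids: (N,) integer ideology label of the publishing outlet
    """
    z = F.normalize(embeddings, dim=-1)
    sim = z @ z.t() / temperature                       # pairwise cosine similarities
    n = z.size(0)
    eye = torch.eye(n, dtype=torch.bool)
    same_story = story_ids.unsqueeze(0) == story_ids.unsqueeze(1)
    same_ideo = ideology_ids.unsqueeze(0) == ideology_ids.unsqueeze(1)
    pos_mask = same_story & same_ideo & ~eye            # positives: same story, same ideology
    cand_mask = same_story & ~eye                       # candidates: any other article on the story
    sim = sim.masked_fill(~cand_mask, float("-inf"))
    log_prob = sim - torch.logsumexp(sim, dim=1, keepdim=True)
    # Average log-likelihood of positives, for anchors that have at least one positive.
    has_pos = pos_mask.any(dim=1)
    loss = -(log_prob[has_pos] * pos_mask[has_pos]).sum(1) / pos_mask[has_pos].sum(1)
    return loss.mean()


if __name__ == "__main__":
    torch.manual_seed(0)
    emb = torch.randn(6, 16, requires_grad=True)         # 6 toy article embeddings
    stories = torch.tensor([0, 0, 0, 1, 1, 1])            # two news stories, three articles each
    ideologies = torch.tensor([0, 0, 1, 0, 1, 1])         # e.g. 0 = left-leaning, 1 = right-leaning
    print(story_contrastive_loss(emb, stories, ideologies))
```

In practice such a loss would be combined with continued masked-language-model pretraining over the collected political news corpus; the toy random embeddings above stand in for encoder outputs purely to keep the sketch self-contained.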

Similar Work