
Deeper Text Understanding For IR With Contextual Neural Language Modeling

Zhuyun Dai, Jamie Callan. Proceedings of the 42nd International ACM SIGIR Conference on Research and Development in Information Retrieval (SIGIR 2019) – 233 citations

[Paper]   Search on Google Scholar   Search on Semantic Scholar
SIGIR

Neural networks provide new possibilities to automatically learn complex language patterns and query-document relations. Neural IR models have achieved promising results in learning query-document relevance patterns, but little work has explored understanding the text content of a query or a document. This paper studies leveraging a recently proposed contextual neural language model, BERT, to provide deeper text understanding for IR. Experimental results demonstrate that the contextual text representations from BERT are more effective than traditional word embeddings. Compared to bag-of-words retrieval models, the contextual language model can better leverage language structure, bringing large improvements on queries written in natural language. Combining this text understanding ability with search knowledge leads to an enhanced pre-trained BERT model that can benefit related search tasks where training data are limited.
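The core recipe the abstract describes — feeding the query and document into BERT as a single sequence pair and reading relevance off the contextual representation — can be sketched with the Hugging Face `transformers` library. This is a minimal illustration, not the authors' released code: the model name (`bert-base-uncased`), the two-class head, and the example data are assumptions, and the freshly initialized classification head would need fine-tuning on relevance labels before its scores mean anything.

```python
# Minimal sketch of BERT-based query-document relevance scoring,
# in the spirit of Dai & Callan (SIGIR 2019). Model choice and
# fine-tuning details are assumptions, not the authors' setup.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelForSequenceClassification.from_pretrained(
    "bert-base-uncased", num_labels=2  # not relevant / relevant
)
model.eval()  # NOTE: the head is untrained here; fine-tune first.

def relevance_score(query: str, document: str) -> float:
    # BERT sees the pair as one sequence:
    # [CLS] query [SEP] document [SEP]
    # so self-attention runs jointly over query and document terms.
    inputs = tokenizer(
        query, document,
        truncation=True, max_length=512, return_tensors="pt",
    )
    with torch.no_grad():
        logits = model(**inputs).logits
    # Probability of the "relevant" class serves as the ranking score.
    return torch.softmax(logits, dim=-1)[0, 1].item()

# Example: rank two candidate passages for a natural-language query.
query = "what effects does caffeine have on sleep quality"
docs = [
    "Caffeine blocks adenosine receptors, which can delay sleep onset.",
    "The history of coffee cultivation dates back several centuries.",
]
ranked = sorted(docs, key=lambda d: relevance_score(query, d), reverse=True)
print(ranked)
```

Scoring the query and document jointly (a cross-encoder) is what lets the model attend across the pair and exploit language structure in natural-language queries; the trade-off is that every candidate document must pass through the model at query time, so in practice such scorers are typically used to re-rank a shortlist from a cheaper first-stage retriever.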
