
N-LTP: An Open-source Neural Language Technology Platform For Chinese

Wanxiang Che, Yunlong Feng, Libo Qin, Ting Liu. Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing: System Demonstrations, 2021 – 98 citations

[Code] [Paper]
EMNLP Efficiency Has Code Tools

We introduce N-LTP, an open-source neural language technology platform supporting six fundamental Chinese NLP tasks: lexical analysis (Chinese word segmentation, part-of-speech tagging, and named entity recognition), syntactic parsing (dependency parsing), and semantic parsing (semantic dependency parsing and semantic role labeling). Unlike existing state-of-the-art toolkits such as Stanza, which adopt an independent model for each task, N-LTP uses a multi-task framework with a shared pre-trained model, which has the advantage of capturing knowledge shared across related Chinese tasks. In addition, a knowledge distillation method (Clark et al., 2019), in which single-task models teach the multi-task model, is introduced to encourage the multi-task model to surpass its single-task teachers. Finally, we provide a collection of easy-to-use APIs and a visualization tool so that users can run the pipeline and view the processing results more easily and directly. To the best of our knowledge, this is the first toolkit to support all six of these fundamental Chinese NLP tasks. Source code, documentation, and pre-trained models are available at https://github.com/HIT-SCIR/ltp.
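The multi-task-plus-distillation setup described in the abstract can be pictured with a small sketch. The code below is a hypothetical PyTorch illustration of BAM-style distillation, where frozen single-task teachers supervise a multi-task student that shares one encoder across task-specific heads; it is not N-LTP's actual implementation, and the class names, toy GRU encoder, task set, and loss weighting are all assumptions made for the example.

```python
# Hypothetical sketch (not N-LTP's code): single-task teachers distilling into a
# multi-task student that shares one encoder and keeps one lightweight head per task.
import torch
import torch.nn as nn
import torch.nn.functional as F

class MultiTaskStudent(nn.Module):
    """Shared encoder + per-task classification heads (names and sizes are illustrative)."""
    def __init__(self, hidden=256, vocab=1000, num_labels=None):
        super().__init__()
        num_labels = num_labels or {"cws": 4, "pos": 30, "ner": 13}
        self.embed = nn.Embedding(vocab, hidden)            # stand-in for a shared pre-trained encoder
        self.encoder = nn.GRU(hidden, hidden, batch_first=True)
        self.heads = nn.ModuleDict({t: nn.Linear(hidden, n) for t, n in num_labels.items()})

    def forward(self, token_ids, task):
        h, _ = self.encoder(self.embed(token_ids))
        return self.heads[task](h)                          # per-token logits for the requested task

def distill_step(student, teachers, batch, optimizer, temperature=2.0, alpha=0.5):
    """One step: hard-label loss plus KL to each frozen single-task teacher's soft labels."""
    optimizer.zero_grad()
    total = 0.0
    for task, (token_ids, gold) in batch.items():
        s_logits = student(token_ids, task)
        with torch.no_grad():
            t_logits = teachers[task](token_ids, task)      # teacher trained on this task only
        hard = F.cross_entropy(s_logits.flatten(0, 1), gold.flatten())
        soft = F.kl_div(
            F.log_softmax(s_logits / temperature, dim=-1),
            F.softmax(t_logits / temperature, dim=-1),
            reduction="batchmean",
        ) * temperature ** 2
        total = total + alpha * hard + (1 - alpha) * soft
    total.backward()
    optimizer.step()
    return float(total)
```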
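For the APIs, the authors publish the `ltp` Python package (installable with `pip install ltp`). The snippet below follows the usage style documented in the repository's README around the time of the paper (LTP 4.x), where `seg` returns both the segments and hidden states that the other task methods reuse; later releases changed the interface, so treat the exact method names as version-dependent assumptions.

```python
# Usage sketch in the LTP 4.x README style (pip install ltp); method names may
# differ in newer releases of the package.
from ltp import LTP

ltp = LTP()                                    # loads a default pre-trained model

seg, hidden = ltp.seg(["他叫汤姆去拿外衣。"])    # Chinese word segmentation
pos = ltp.pos(hidden)                           # part-of-speech tagging
ner = ltp.ner(hidden)                           # named entity recognition
dep = ltp.dep(hidden)                           # dependency parsing
sdp = ltp.sdp(hidden)                           # semantic dependency parsing
srl = ltp.srl(hidden)                           # semantic role labeling

print(seg, pos, ner, sep="\n")
```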

Similar Work