Modal Dependency Parsing via Language Model Priming
Conference proceeding


Jiarui Yao, Nianwen Xue, Bonan Min
NAACL 2022: The 2022 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, pp. 2913-2919
2022

Abstract

Subjects: Computer Science; Computer Science, Artificial Intelligence; Computer Science, Interdisciplinary Applications; Linguistics; Science & Technology; Social Sciences; Technology
The task of modal dependency parsing aims to parse a text into its modal dependency structure, which is a representation for the factuality of events in the text. We design a modal dependency parser that is based on priming pre-trained language models, and evaluate the parser on two data sets. Compared to baselines, we show an improvement of 2.6% in F-score for English and 4.6% for Chinese. To the best of our knowledge, this is also the first work on Chinese modal dependency parsing.
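The abstract does not detail how priming is implemented, but a common form of priming is to mark the candidate event in the input so the pre-trained language model conditions on it when predicting its modal parent and dependency label. The sketch below illustrates only that input-construction step; the function name and marker tokens are hypothetical, not the authors' actual implementation.

```python
# Hypothetical sketch: wrap the target event trigger in marker tokens
# before feeding the sentence to a pre-trained language model, so the
# model's prediction is "primed" on that event. Marker tokens and the
# helper name are illustrative assumptions, not the paper's code.

def prime_sentence(tokens, event_idx, marker="<event>"):
    """Return a copy of `tokens` with markers around the token at event_idx."""
    end_marker = marker.replace("<", "</", 1)  # "<event>" -> "</event>"
    return (tokens[:event_idx]
            + [marker, tokens[event_idx], end_marker]
            + tokens[event_idx + 1:])

tokens = "She said the deal might collapse".split()
primed = prime_sentence(tokens, 5)  # prime on the event "collapse"
print(" ".join(primed))
# She said the deal might <event> collapse </event>
```

The primed string would then be encoded by the language model, with the representation at the marked span used to score candidate modal parents and labels (e.g., degrees of certainty).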
