Modal Dependency Parsing via Biaffine Attention with Self-Loop
Conference paper   Open access


Jayeol Chun and Nianwen Xue
The Annual Meeting of the Association for Computational Linguistics, 2025 (San Diego, CA, 07/27/2025–08/01/2025)

Abstract

A modal dependency structure represents a web of connections between events and sources of information in a document, allowing one to trace who said what with what level of certainty and thereby establish factuality in an event-centric approach. Obtaining such graphs defines the task of modal dependency parsing, which involves event and source identification along with the modal relations between them. In this paper, we propose a simple yet effective solution based on biaffine attention that specifically addresses the domain-specific challenges of modal dependency parsing by integrating a self-loop. We show that our approach, when coupled with data augmentation that leverages large language models to translate annotations from one language to another, outperforms the previous state-of-the-art on English and Chinese datasets by 2% and 4% respectively.
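To make the core mechanism concrete, the following is a minimal sketch of biaffine attention scoring with the diagonal (self-loop) retained, so a token may select itself as its own head. All names, dimensions, and the NumPy stand-in for encoder outputs are hypothetical illustrations, not the paper's actual implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

seq_len, d = 5, 8  # hypothetical token count and hidden size

# Hypothetical contextual token representations (e.g. from an encoder);
# a real parser would use learned dependent/head projections of these.
H = rng.standard_normal((seq_len, d))
dep, head = H, H

# Biaffine scoring: s[i, j] = dep_i^T W head_j + dep_i^T u + head_j^T v + b
W = rng.standard_normal((d, d))
u = rng.standard_normal(d)
v = rng.standard_normal(d)
b = 0.0

scores = dep @ W @ head.T + (dep @ u)[:, None] + (head @ v)[None, :] + b

# Self-loop: unlike standard syntactic dependency parsing, the diagonal
# scores[i, i] is NOT masked to -inf, so a token can attach to itself.
predicted_heads = scores.argmax(axis=1)
```

In training, `scores` would feed a cross-entropy loss over candidate heads per token; the unmasked diagonal is what lets the model express the self-loop attachments the paper exploits.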

