Authors
Listed:
- Lijun Tan
(National Key Laboratory of Information Systems Engineering, National University of Defense Technology, Changsha 410073, China)
- Yanli Hu
(National Key Laboratory of Information Systems Engineering, National University of Defense Technology, Changsha 410073, China)
- Jianwei Cao
(National Key Laboratory of Information Systems Engineering, National University of Defense Technology, Changsha 410073, China)
- Zhen Tan
(National Key Laboratory of Information Systems Engineering, National University of Defense Technology, Changsha 410073, China)
Abstract
Event argument extraction is a crucial subtask of event extraction that aims to extract the arguments filling each argument role of a given event type. Most current document-level event argument extraction work focuses on extracting the arguments of only one event at a time, without considering associations among events; this is known as document-level single-event extraction. However, the interrelationships among arguments can yield mutual gains in their extraction. In this paper, we therefore propose AssocKD, an Association-aware Knowledge Distillation Method for Document-level Event Argument Extraction, which enhances document-level multi-event extraction with event association knowledge. First, we introduce an association-aware training task that extracts unknown arguments given privileged knowledge of relevant arguments, yielding an association-aware model that captures both intra-event and inter-event relationships. Second, we adopt multi-teacher knowledge distillation to transfer this event association knowledge from the association-aware teacher models to the event argument extraction student model. AssocKD thus explicitly models and efficiently leverages event association to improve the extraction of multi-event arguments at the document level. Experiments on the RAMS and WIKIEVENTS datasets show significant improvements, demonstrating the effectiveness of our method.
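The abstract does not specify the distillation objective, so the following is only a minimal sketch of the standard multi-teacher knowledge distillation step it describes: the student's argument-role logits are trained against both the gold labels and the softened predictions of several association-aware teachers. The function name, temperature T, mixing weight alpha, and toy tensor shapes are illustrative assumptions, not values from the paper.

import torch
import torch.nn.functional as F

def multi_teacher_kd_loss(student_logits, teacher_logits_list, labels,
                          T=2.0, alpha=0.5):
    """Combine hard-label cross-entropy with the mean KL divergence
    to a set of teacher distributions (standard KD formulation)."""
    # Supervised loss on the gold argument-role labels.
    ce = F.cross_entropy(student_logits, labels)
    # Soft loss: KL(teacher || student) at temperature T, averaged over teachers.
    log_p_student = F.log_softmax(student_logits / T, dim=-1)
    kd = sum(
        F.kl_div(log_p_student, F.softmax(t / T, dim=-1),
                 reduction="batchmean")
        for t in teacher_logits_list
    ) / len(teacher_logits_list)
    # T**2 rescales the gradients of the softened loss (Hinton et al., 2015).
    return alpha * ce + (1 - alpha) * kd * T * T

# Toy usage: batch of 4 candidate spans, 10 argument roles, 2 teachers.
student = torch.randn(4, 10, requires_grad=True)
teachers = [torch.randn(4, 10), torch.randn(4, 10)]
labels = torch.randint(0, 10, (4,))
loss = multi_teacher_kd_loss(student, teachers, labels)
loss.backward()

Averaging the per-teacher KL terms is one common way to aggregate multiple teachers; the paper's actual weighting of its association-aware teachers may differ.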
Suggested Citation
Lijun Tan & Yanli Hu & Jianwei Cao & Zhen Tan, 2024.
"AssocKD: An Association-Aware Knowledge Distillation Method for Document-Level Event Argument Extraction,"
Mathematics, MDPI, vol. 12(18), pages 1-20, September.
Handle:
RePEc:gam:jmathe:v:12:y:2024:i:18:p:2901-:d:1480037