Collective prompt tuning with relation inference for document-level relation extraction
Abstract: Document-level relation extraction (RE) aims to extract relations between entities that may span multiple sentences. Existing methods mainly rely on two types of techniques: pre-trained language models (PLMs) and reasoning skills. Although various reasoning methods have been proposed, how to elicit the factual knowledge learnt by PLMs for better reasoning ability has not yet been explored. In this paper, we propose a novel Collective Prompt Tuning with Relation Inference (CPT-RI) for document-level RE, which improves upon existing models in two aspects. First, considering the long input and the variety of templates, we adopt a collective prompt tuning method based on an update-and-reuse strategy: a generic prompt is first encoded and then updated with each exact entity pair to produce relation-specific prompts. Second, we introduce a relation inference module that conducts global reasoning over all relation prompts via constrained semantic segmentation. Extensive experiments on three publicly available benchmark datasets demonstrate the effectiveness of our proposed CPT-RI compared to the baseline model ATLOP (Zhou et al., 2021), improving the F1 score by 0.57% on the DocRED dataset, 2.20% on the CDR dataset, and 2.30% on the GDA dataset. In addition, further ablation studies verify the effects of the collective prompt tuning and relation inference.
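The update-and-reuse strategy described above can be pictured as follows. This is a toy sketch under assumed shapes and a hypothetical additive fusion rule (`relation_specific_prompt` and its internals are illustrative, not the paper's actual architecture): the generic prompt is encoded once and reused, while each entity pair derives its own relation-specific prompt from it.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 8  # toy hidden size

# Generic prompt: encoded once, then reused for every entity pair.
generic_prompt = rng.normal(size=(4, d))  # 4 prompt tokens

def relation_specific_prompt(head_emb, tail_emb, generic=generic_prompt):
    """Update the shared generic prompt with an entity pair's embeddings.

    Hypothetical fusion: a pair-conditioned shift is broadcast-added to
    every prompt token, leaving the shared generic prompt untouched.
    """
    pair_shift = np.tanh(head_emb + tail_emb)  # fuse head and tail (d,)
    return generic + pair_shift                # broadcast over the 4 tokens

head = rng.normal(size=d)
tail = rng.normal(size=d)
prompt = relation_specific_prompt(head, tail)
print(prompt.shape)  # (4, 8)
```

Because the generic prompt is never mutated, the same encoded prompt can be updated independently for every candidate entity pair in the document, which is the "collective" aspect of the tuning.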
Keywords: Natural language processing; Document-level relation extraction; Prompt tuning; Various templates; Global reasoning
This article is indexed by ScienceDirect and other databases.