TY - GEN
T1 - Syntax Controlled Knowledge Graph-to-Text Generation with Order and Semantic Consistency
AU - Liu, Jin
AU - Fan, Chongfeng
AU - Zhou, Fengyu
AU - Xu, Huijuan
N1 - Funding Information:
Prof. Zhou’s group is supported by The National Key R & D Program of China (Grant No. 2017YFB1302400), Jinan ’20 New Colleges and Universities’ Funded Scientific Research Leader Studio (2021GXRC079), Major Agricultural Applied Technological Innovation Projects of Shandong Province (SD2019NJ014), Shandong Natural Science Foundation (ZR2019MF064), and Beijing Advanced Innovation Center for Intelligent Robots and Systems (2019IRS19). In addition, the authors thank the anonymous reviewers for providing valuable comments to improve this paper.
Publisher Copyright:
© Findings of the Association for Computational Linguistics: NAACL 2022 - Findings.
PY - 2022
Y1 - 2022
N2 - The knowledge graph (KG) stores a large amount of structured knowledge, but it is not easy for humans to understand directly. Knowledge graph-to-text (KG-to-text) generation aims to generate easy-to-understand sentences from the KG and, at the same time, maintain semantic consistency between the generated sentences and the KG. Existing KG-to-text generation methods phrase this task as a sequence-to-sequence generation task with a linearized KG as input and address the consistency of the generated texts and the KG through a simple selection between the decoded sentence word and the KG node word at each time step. However, the linearized KG order is commonly obtained through a heuristic search without data-driven optimization. In this paper, we optimize the knowledge description order prediction under the order supervision extracted from the caption and further enhance the consistency of the generated sentences and the KG through syntactic and semantic regularization. We incorporate Part-of-Speech (POS) syntactic tags to constrain the positions at which words are copied from the KG and employ a semantic context scoring function to evaluate the semantic fitness of each word in its local context when decoding each word of the generated sentence. Extensive experiments are conducted on two datasets, WebNLG and DART, and achieve state-of-the-art performance. Our code is now publicly available.
AB - The knowledge graph (KG) stores a large amount of structured knowledge, but it is not easy for humans to understand directly. Knowledge graph-to-text (KG-to-text) generation aims to generate easy-to-understand sentences from the KG and, at the same time, maintain semantic consistency between the generated sentences and the KG. Existing KG-to-text generation methods phrase this task as a sequence-to-sequence generation task with a linearized KG as input and address the consistency of the generated texts and the KG through a simple selection between the decoded sentence word and the KG node word at each time step. However, the linearized KG order is commonly obtained through a heuristic search without data-driven optimization. In this paper, we optimize the knowledge description order prediction under the order supervision extracted from the caption and further enhance the consistency of the generated sentences and the KG through syntactic and semantic regularization. We incorporate Part-of-Speech (POS) syntactic tags to constrain the positions at which words are copied from the KG and employ a semantic context scoring function to evaluate the semantic fitness of each word in its local context when decoding each word of the generated sentence. Extensive experiments are conducted on two datasets, WebNLG and DART, and achieve state-of-the-art performance. Our code is now publicly available.
UR - http://www.scopus.com/inward/record.url?scp=85137348476&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85137348476&partnerID=8YFLogxK
M3 - Conference contribution
AN - SCOPUS:85137348476
T3 - Findings of the Association for Computational Linguistics: NAACL 2022 - Findings
SP - 1278
EP - 1291
BT - Findings of the Association for Computational Linguistics
PB - Association for Computational Linguistics (ACL)
T2 - 2022 Findings of the Association for Computational Linguistics: NAACL 2022
Y2 - 10 July 2022 through 15 July 2022
ER -