TY - GEN
T1 - Pairwise Representation Learning for Event Coreference
AU - Yu, Xiaodong
AU - Yin, Wenpeng
AU - Roth, Dan
N1 - Funding Information:
This work was supported by Contract FA8750-19-2-1004 with the US Defense Advanced Research Projects Agency (DARPA), the Office of the Director of National Intelligence (ODNI), Intelligence Advanced Research Projects Activity (IARPA), via IARPA Contract No. 2019-19051600006 under the BETTER Program, and a Focused Award from Google. Approved for Public Release, Distribution Unlimited. The views and conclusions contained herein are those of the authors and should not be interpreted as necessarily representing the official policies, either expressed or implied, of ODNI, IARPA, the Department of Defense, or the U.S. Government. The U.S. Government is authorized to reproduce and distribute reprints for governmental purposes notwithstanding any copyright annotation therein.
Publisher Copyright:
© 2022 Association for Computational Linguistics.
PY - 2022
Y1 - 2022
AB - Natural Language Processing tasks such as resolving the coreference of events require understanding the relations between two text snippets. These tasks are typically formulated as (binary) classification problems over independently induced representations of the text snippets. In this work, we develop a Pairwise Representation Learning (PairwiseRL) scheme for event mention pairs, in which we jointly encode a pair of text snippets so that the representation of each mention in the pair is induced in the context of the other. Furthermore, our scheme supports a finer, structured representation of each text snippet that facilitates encoding events and their arguments. We show that PairwiseRL, despite its simplicity, outperforms prior state-of-the-art event coreference systems on both cross-document and within-document event coreference benchmarks. We also conduct an in-depth analysis of the improvements and limitations of pairwise representation learning to provide insights for future work.
UR - http://www.scopus.com/inward/record.url?scp=85139087330&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85139087330&partnerID=8YFLogxK
M3 - Conference contribution
AN - SCOPUS:85139087330
T3 - *SEM 2022 - 11th Joint Conference on Lexical and Computational Semantics, Proceedings of the Conference
SP - 69
EP - 78
BT - *SEM 2022 - 11th Joint Conference on Lexical and Computational Semantics, Proceedings of the Conference
A2 - Nastase, Vivi
A2 - Pavlick, Ellie
A2 - Pilehvar, Mohammad Taher
A2 - Camacho-Collados, Jose
A2 - Raganato, Alessandro
PB - Association for Computational Linguistics (ACL)
T2 - 11th Joint Conference on Lexical and Computational Semantics, *SEM 2022
Y2 - 14 July 2022 through 15 July 2022
ER -
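
A minimal sketch (not from the paper) of the pairwise joint-encoding idea described in the abstract: the two event mentions are fed to a single transformer encoder as one sequence, so each mention's representation is contextualized by the other, and a binary classifier scores coreference. The encoder name, pooling choice, and example sentences below are illustrative assumptions, not details taken from the paper.

# Illustrative sketch of pairwise (joint) encoding for event coreference.
# Assumptions (not from the paper): roberta-base encoder, sentence-pair
# concatenation via the tokenizer, and a CLS-based binary classifier head.
import torch
import torch.nn as nn
from transformers import AutoModel, AutoTokenizer

class PairwiseCorefScorer(nn.Module):
    def __init__(self, model_name="roberta-base"):
        super().__init__()
        self.encoder = AutoModel.from_pretrained(model_name)
        hidden = self.encoder.config.hidden_size
        # Binary classifier over the jointly encoded pair.
        self.classifier = nn.Linear(hidden, 2)

    def forward(self, input_ids, attention_mask):
        out = self.encoder(input_ids=input_ids, attention_mask=attention_mask)
        pooled = out.last_hidden_state[:, 0]    # representation of the joint pair
        return self.classifier(pooled)          # logits: [not-coref, coref]

tokenizer = AutoTokenizer.from_pretrained("roberta-base")
model = PairwiseCorefScorer()

# Two event mentions with their sentence context; encoding them together lets
# each mention's representation attend to the other mention and its context.
sent_a = "The company announced the acquisition on Monday."
sent_b = "The deal was completed after months of negotiation."
batch = tokenizer(sent_a, sent_b, return_tensors="pt", truncation=True)

with torch.no_grad():
    logits = model(batch["input_ids"], batch["attention_mask"])
prob_coref = torch.softmax(logits, dim=-1)[0, 1].item()
print(f"coreference probability (untrained, illustrative): {prob_coref:.3f}")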