
Injecting knowledge base information into end-to-end joint entity and relation extraction and coreference resolution

Abstract
We consider a joint information extraction (IE) model, solving named entity recognition, coreference resolution and relation extraction jointly over the whole document. In particular, we study how to inject information from a knowledge base (KB) in such an IE model, based on unsupervised entity linking. The used KB entity representations are learned from either (i) hyperlinked text documents (Wikipedia), or (ii) a knowledge graph (Wikidata), and appear complementary in raising IE performance. Representations of corresponding entity linking (EL) candidates are added to text span representations of the input document, and we experiment with (i) taking a weighted average of the EL candidate representations based on their prior (in Wikipedia), and (ii) using an attention scheme over the EL candidate list. Results demonstrate an increase of up to 5% F1-score for the evaluated IE tasks on two datasets. Despite a strong performance of the prior-based model, our quantitative and qualitative analysis reveals the advantage of using the attention-based approach.
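
As a concrete illustration of the two injection schemes described in the abstract, here is a minimal sketch in Python/PyTorch. This is not the authors' implementation: the tensor shapes, the hypothetical proj layer, and the element-wise addition of the KB representation to the span representation are assumptions inferred from the abstract's wording.

    import torch
    import torch.nn.functional as F

    def inject_prior_weighted(span_repr, cand_embs, cand_priors):
        # (i) Weighted average of EL candidate embeddings, weighted by their
        # Wikipedia-derived priors; the result is added to the span
        # representation (the abstract says KB representations are "added").
        # span_repr: (dim,); cand_embs: (n_cand, dim); cand_priors: (n_cand,)
        weights = cand_priors / cand_priors.sum()
        kb_repr = (weights.unsqueeze(-1) * cand_embs).sum(dim=0)
        return span_repr + kb_repr

    def inject_attention(span_repr, cand_embs, proj):
        # (ii) Attention over the EL candidate list, with the (projected)
        # span representation acting as the query.
        # proj: e.g. torch.nn.Linear(dim, dim), a hypothetical projection.
        query = proj(span_repr)            # (dim,)
        scores = cand_embs @ query         # one score per candidate
        attn = F.softmax(scores, dim=0)    # attention weights over candidates
        kb_repr = (attn.unsqueeze(-1) * cand_embs).sum(dim=0)
        return span_repr + kb_repr

In the paper's joint setup, the enriched span representations would then feed the named entity recognition, coreference resolution and relation extraction components together.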

Downloads

  • published.pdf: full text (Published version) | open access | PDF | 1.10 MB

Citation

Please use this URL to cite or link to this publication:

MLA
Verlinden, Severine, et al. “Injecting Knowledge Base Information into End-to-End Joint Entity and Relation Extraction and Coreference Resolution.” Findings of the Association for Computational Linguistics: ACL-IJCNLP 2021, edited by Chengqing Zong et al., Association for Computational Linguistics (ACL), 2021, pp. 1952–57, doi:10.18653/v1/2021.findings-acl.171.
APA
Verlinden, S., Zaporojets, K., Deleu, J., Demeester, T., & Develder, C. (2021). Injecting knowledge base information into end-to-end joint entity and relation extraction and coreference resolution. In C. Zong, F. Xia, W. Li, & R. Navigli (Eds.), Findings of the Association for Computational Linguistics: ACL-IJCNLP 2021 (pp. 1952–1957). https://doi.org/10.18653/v1/2021.findings-acl.171
Chicago author-date
Verlinden, Severine, Klim Zaporojets, Johannes Deleu, Thomas Demeester, and Chris Develder. 2021. “Injecting Knowledge Base Information into End-to-End Joint Entity and Relation Extraction and Coreference Resolution.” In Findings of the Association for Computational Linguistics: ACL-IJCNLP 2021, edited by Chengqing Zong, Fei Xia, Wenjie Li, and Roberto Navigli, 1952–57. Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/2021.findings-acl.171.
Chicago author-date (all authors)
Verlinden, Severine, Klim Zaporojets, Johannes Deleu, Thomas Demeester, and Chris Develder. 2021. “Injecting Knowledge Base Information into End-to-End Joint Entity and Relation Extraction and Coreference Resolution.” In Findings of the Association for Computational Linguistics: ACL-IJCNLP 2021, edited by Chengqing Zong, Fei Xia, Wenjie Li, and Roberto Navigli, 1952–1957. Association for Computational Linguistics (ACL). doi:10.18653/v1/2021.findings-acl.171.
Vancouver
1.
Verlinden S, Zaporojets K, Deleu J, Demeester T, Develder C. Injecting knowledge base information into end-to-end joint entity and relation extraction and coreference resolution. In: Zong C, Xia F, Li W, Navigli R, editors. Findings of the Association for Computational Linguistics: ACL-IJCNLP 2021. Association for Computational Linguistics (ACL); 2021. p. 1952–7.
IEEE
[1]
S. Verlinden, K. Zaporojets, J. Deleu, T. Demeester, and C. Develder, “Injecting knowledge base information into end-to-end joint entity and relation extraction and coreference resolution,” in Findings of the Association for Computational Linguistics: ACL-IJCNLP 2021, Online (Bangkok, Thailand), 2021, pp. 1952–1957.
@inproceedings{8716944,
  abstract     = {{We consider a joint information extraction (IE) model, solving named entity recognition, coreference resolution and relation extraction jointly over the whole document. In particular, we study how to inject information from a knowledge base (KB) in such an IE model, based on unsupervised entity linking. The used KB entity representations are learned from either (i) hyperlinked text documents (Wikipedia), or (ii) a knowledge graph (Wikidata), and appear complementary in raising IE performance. Representations of corresponding entity linking (EL) candidates are added to text span representations of the input document, and we experiment with (i) taking a weighted average of the EL candidate representations based on their prior (in Wikipedia), and (ii) using an attention scheme over the EL candidate list. Results demonstrate an increase of up to 5% F1-score for the evaluated IE tasks on two datasets. Despite a strong performance of the prior-based model, our quantitative and qualitative analysis reveals the advantage of using the attention-based approach.}},
  author       = {{Verlinden, Severine and Zaporojets, Klim and Deleu, Johannes and Demeester, Thomas and Develder, Chris}},
  booktitle    = {{Findings of the Association for Computational Linguistics: ACL-IJCNLP 2021}},
  editor       = {{Zong, Chengqing and Xia, Fei and Li, Wenjie and Navigli, Roberto}},
  isbn         = {{9781954085541}},
  language     = {{eng}},
  location     = {{Online (Bangkok, Thailand)}},
  pages        = {{1952--1957}},
  publisher    = {{Association for Computational Linguistics (ACL)}},
  title        = {{Injecting knowledge base information into end-to-end joint entity and relation extraction and coreference resolution}},
  url          = {{https://doi.org/10.18653/v1/2021.findings-acl.171}},
  year         = {{2021}},
}
