
An attentive neural architecture for joint segmentation and parsing and its application to real estate ads

Ioannis Bekoulis (UGent) , Johannes Deleu (UGent) , Thomas Demeester (UGent) and Chris Develder (UGent)
Abstract
In processing human-produced text with natural language processing (NLP) techniques, two fundamental subtasks arise: (i) segmentation of the plain text into meaningful subunits (e.g., entities), and (ii) dependency parsing, to establish relations between subunits. Such structural interpretation of text provides essential building blocks for upstream expert system tasks: e.g., from interpreting textual real estate ads, one may want to provide an accurate price estimate and/or selection filters for end users looking for a particular property, all of which could rely on knowing the types and number of rooms, etc. In this paper, we develop a relatively simple and effective neural joint model that performs segmentation and dependency parsing together, instead of one after the other as in most state-of-the-art works. We focus in particular on the real estate ad setting, aiming to convert an ad into a structured description, which we name a property tree, comprising the tasks of (1) identifying important entities of a property (e.g., rooms) from classifieds and (2) structuring them into a tree format. We propose a new joint model that tackles the two tasks simultaneously and constructs the property tree by (i) avoiding the error propagation that would arise from executing the subtasks one after the other in pipelined fashion, and (ii) exploiting the interactions between the subtasks. For this purpose, we perform an extensive comparative study of the pipeline methods and the newly proposed joint model, reporting an improvement of over three percentage points in the overall edge F1 score of the property tree. We also propose attention methods to encourage our model to focus on salient tokens during construction of the property tree.
We thus experimentally demonstrate the usefulness of attentive neural architectures for the proposed joint model, showcasing a further improvement of two percentage points in edge F1 score for our application. While the results demonstrated are for the particular real estate setting, the model is generic in nature and could equally be applied to other expert system scenarios requiring the general tasks of (i) detecting entities (segmentation) and (ii) establishing relations among them (dependency parsing). © 2018 Elsevier Ltd. All rights reserved.
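The abstract describes one shared encoding feeding two heads, a segmentation head and an attention-based arc-scoring head, so that neither subtask waits on the other's (possibly erroneous) output. A minimal, hypothetical sketch of that idea follows; this is not the authors' architecture, and all names, dimensions, and the toy tag set are invented for illustration:

```python
# Illustrative sketch (not the paper's implementation): a joint model scores
# segmentation tags and dependency arcs from a single shared token encoding,
# rather than running segmentation and parsing as a pipeline.
import numpy as np

rng = np.random.default_rng(0)

tokens = ["spacious", "apartment", "with", "two", "bedrooms"]
n, d, n_tags = len(tokens), 8, 3          # toy tag set: B / I / O

H = rng.normal(size=(n, d))               # shared encoder output, one row per token

# Head 1: segmentation -- per-token tag scores from the shared encoding.
W_seg = rng.normal(size=(d, n_tags))
tag_scores = H @ W_seg                    # shape (n, n_tags)
tags = tag_scores.argmax(axis=1)          # predicted tag index per token

# Head 2: parsing -- score every (dependent, head) arc from the SAME encoding,
# here with a simple bilinear form; a softmax over each row acts as attention
# over candidate heads, weighting salient tokens more heavily.
W_arc = rng.normal(size=(d, d))
arc_scores = H @ W_arc @ H.T              # arc_scores[i, j]: token j heads token i
z = arc_scores - arc_scores.max(axis=1, keepdims=True)   # numerical stability
attn = np.exp(z) / np.exp(z).sum(axis=1, keepdims=True)  # rows sum to 1
heads = attn.argmax(axis=1)               # greedy head choice per token

print(list(zip(tokens, tags.tolist(), heads.tolist())))
```

Because both heads read the same `H`, gradients from the parsing loss would also shape the representation used for segmentation (and vice versa) during training, which is the interaction a pipeline forgoes.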
Keywords
NAMED ENTITY RECOGNITION, EXTRACTION, NETWORKS, Neural networks, Joint model, Relation extraction, Entity recognition, Dependency parsing

Downloads

  • (...).pdf: full text | UGent only | PDF | 1.10 MB
  • 7183 i.pdf: full text | open access | PDF | 808.49 KB

Citation


Chicago
Bekoulis, Ioannis, Johannes Deleu, Thomas Demeester, and Chris Develder. 2018. “An Attentive Neural Architecture for Joint Segmentation and Parsing and Its Application to Real Estate Ads.” Expert Systems with Applications 102: 100–112.
APA
Bekoulis, I., Deleu, J., Demeester, T., & Develder, C. (2018). An attentive neural architecture for joint segmentation and parsing and its application to real estate ads. EXPERT SYSTEMS WITH APPLICATIONS, 102, 100–112.
Vancouver
Bekoulis I, Deleu J, Demeester T, Develder C. An attentive neural architecture for joint segmentation and parsing and its application to real estate ads. EXPERT SYSTEMS WITH APPLICATIONS. Oxford: Pergamon-Elsevier Science Ltd; 2018;102:100–12.
MLA
Bekoulis, Ioannis, et al. “An Attentive Neural Architecture for Joint Segmentation and Parsing and Its Application to Real Estate Ads.” EXPERT SYSTEMS WITH APPLICATIONS 102 (2018): 100–112. Print.
@article{8561548,
  abstract     = {In processing human produced text using natural language processing (NLP) techniques, two fundamental subtasks that arise are (i) segmentation of the plain text into meaningful subunits (e.g., entities), and (ii) dependency parsing, to establish relations between subunits. Such structural interpretation of text provides essential building blocks for upstream expert system tasks: e.g., from interpreting textual real estate ads, one may want to provide an accurate price estimate and/or provide selection filters for end users looking for a particular property - which all could rely on knowing the types and number of rooms, etc. In this paper, we develop a relatively simple and effective neural joint model that performs both segmentation and dependency parsing together, instead of one after the other as in most state-of-the-art works. We will focus in particular on the real estate ad setting, aiming to convert an ad to a structured description, which we name property tree, comprising the tasks of (1) identifying important entities of a property (e.g., rooms) from classifieds and (2) structuring them into a tree format. In this work, we propose a new joint model that is able to tackle the two tasks simultaneously and construct the property tree by (i) avoiding the error propagation that would arise from the subtasks one after the other in a pipelined fashion, and (ii) exploiting the interactions between the subtasks. For this purpose, we perform an extensive comparative study of the pipeline methods and the new proposed joint model, reporting an improvement of over three percentage points in the overall edge F-1 score of the property tree. Also, we propose attention methods, to encourage our model to focus on salient tokens during the construction of the property tree. 
Thus we experimentally demonstrate the usefulness of attentive neural architectures for the proposed joint model, showcasing a further improvement of two percentage points in edge F-1 score for our application. While the results demonstrated are for the particular real estate setting, the model is generic in nature, and thus could be equally applied to other expert system scenarios requiring the general tasks of both (i) detecting entities (segmentation) and (ii) establishing relations among them (dependency parsing). (C) 2018 Elsevier Ltd. All rights reserved.},
  author       = {Bekoulis, Ioannis and Deleu, Johannes and Demeester, Thomas and Develder, Chris},
  issn         = {0957-4174},
  journal      = {EXPERT SYSTEMS WITH APPLICATIONS},
  keywords     = {NAMED ENTITY RECOGNITION,EXTRACTION,NETWORKS,Neural networks,Joint model,Relation extraction,Entity recognition,Dependency parsing},
  language     = {eng},
  pages        = {100--112},
  publisher    = {Pergamon-Elsevier Science Ltd},
  title        = {An attentive neural architecture for joint segmentation and parsing and its application to real estate ads},
  url          = {http://dx.doi.org/10.1016/j.eswa.2018.02.031},
  volume       = {102},
  year         = {2018},
}
