
Emotional RobBERT and insensitive BERTje: combining transformers and affect lexica for Dutch emotion detection

Luna De Bruyne (UGent) , Orphée De Clercq (UGent) and Veronique Hoste (UGent)
Abstract
In a first step towards improving Dutch emotion detection, we try to combine the Dutch transformer models BERTje and RobBERT with lexicon-based methods. We propose two architectures: one in which lexicon information is directly injected into the transformer model and a meta-learning approach where predictions from transformers are combined with lexicon features. The models are tested on 1,000 Dutch tweets and 1,000 captions from TV-shows which have been manually annotated with emotion categories and dimensions. We find that RobBERT clearly outperforms BERTje, but that directly adding lexicon information to transformers does not improve performance. In the meta-learning approach, lexicon information does have a positive effect on BERTje, but not on RobBERT. This suggests that more emotional information is already contained within this latter language model.
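The meta-learning approach described above can be illustrated with a minimal sketch: transformer class probabilities and aggregated lexicon scores are concatenated and fed to a simple meta-classifier. This is a generic stacking setup, not the authors' actual implementation; all feature shapes and the random toy data are assumptions for illustration.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Toy stand-ins (assumption: real inputs would be per-class probabilities
# from RobBERT or BERTje plus affect-lexicon scores such as
# valence/arousal/dominance aggregated over the sentence).
n, n_classes, n_lex = 200, 4, 3
transformer_probs = rng.dirichlet(np.ones(n_classes), size=n)  # (n, n_classes)
lexicon_feats = rng.normal(size=(n, n_lex))                    # (n, n_lex)
y = rng.integers(0, n_classes, size=n)                         # emotion labels

# Meta-learner: stack both feature blocks and fit a simple classifier.
X = np.hstack([transformer_probs, lexicon_feats])
meta = LogisticRegression(max_iter=1000).fit(X, y)
preds = meta.predict(X)  # one predicted emotion label per instance
```

In practice the meta-learner would be trained on held-out transformer predictions (e.g. via cross-validation folds) rather than on the same data the base model was fit on.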
Keywords
lt3

Downloads

  • 2021.wassa-1.27.pdf — full text (Published version) | open access | PDF | 346.34 KB

Citation


MLA
De Bruyne, Luna, et al. “Emotional RobBERT and Insensitive BERTje: Combining Transformers and Affect Lexica for Dutch Emotion Detection.” Proceedings of the Eleventh Workshop on Computational Approaches to Subjectivity, Sentiment and Social Media Analysis (EACL 2021), Association for Computational Linguistics, 2021, pp. 257–63.
APA
De Bruyne, L., De Clercq, O., & Hoste, V. (2021). Emotional RobBERT and insensitive BERTje: combining transformers and affect lexica for Dutch emotion detection. Proceedings of the Eleventh Workshop on Computational Approaches to Subjectivity, Sentiment and Social Media Analysis (EACL 2021), 257–263. Association for Computational Linguistics.
Chicago author-date
De Bruyne, Luna, Orphée De Clercq, and Veronique Hoste. 2021. “Emotional RobBERT and Insensitive BERTje: Combining Transformers and Affect Lexica for Dutch Emotion Detection.” In Proceedings of the Eleventh Workshop on Computational Approaches to Subjectivity, Sentiment and Social Media Analysis (EACL 2021), 257–63. Association for Computational Linguistics.
Chicago author-date (all authors)
De Bruyne, Luna, Orphée De Clercq, and Veronique Hoste. 2021. “Emotional RobBERT and Insensitive BERTje: Combining Transformers and Affect Lexica for Dutch Emotion Detection.” In Proceedings of the Eleventh Workshop on Computational Approaches to Subjectivity, Sentiment and Social Media Analysis (EACL 2021), 257–263. Association for Computational Linguistics.
Vancouver
1. De Bruyne L, De Clercq O, Hoste V. Emotional RobBERT and insensitive BERTje: combining transformers and affect lexica for Dutch emotion detection. In: Proceedings of the Eleventh Workshop on Computational Approaches to Subjectivity, Sentiment and Social Media Analysis (EACL 2021). Association for Computational Linguistics; 2021. p. 257–63.
IEEE
[1] L. De Bruyne, O. De Clercq, and V. Hoste, “Emotional RobBERT and insensitive BERTje: combining transformers and affect lexica for Dutch emotion detection,” in Proceedings of the Eleventh Workshop on Computational Approaches to Subjectivity, Sentiment and Social Media Analysis (EACL 2021), Online, 2021, pp. 257–263.
@inproceedings{8727558,
  abstract     = {{In a first step towards improving Dutch emotion detection, we try to combine the Dutch transformer models BERTje and RobBERT with lexicon-based methods. We propose two architectures: one in which lexicon information is directly injected into the transformer model and a meta-learning approach where predictions from transformers are combined with lexicon features. The models are tested on 1,000 Dutch tweets and 1,000 captions from TV-shows which have been manually annotated with emotion categories and dimensions. We find that RobBERT clearly outperforms BERTje, but that directly adding lexicon information to transformers does not improve performance. In the meta-learning approach, lexicon information does have a positive effect on BERTje, but not on RobBERT. This suggests that more emotional information is already contained within this latter language model.}},
  author       = {{De Bruyne, Luna and De Clercq, Orphée and Hoste, Veronique}},
  booktitle    = {{Proceedings of the Eleventh Workshop on Computational Approaches to Subjectivity, Sentiment and Social Media Analysis (EACL 2021)}},
  isbn         = {{9781954085183}},
  keywords     = {{lt3}},
  language     = {{eng}},
  location     = {{Online}},
  pages        = {{257--263}},
  publisher    = {{Association for Computational Linguistics}},
  title        = {{Emotional RobBERT and insensitive BERTje: combining transformers and affect lexica for Dutch emotion detection}},
  url          = {{https://aclanthology.org/2021.wassa-1.27/}},
  year         = {{2021}},
}