
Conveying emotions to robots through touch and sound

Qiaoqiao Ren (UGent) , Remko Proesmans (UGent) , Frederick Bossuyt (UGent) , Jan Vanfleteren (UGent) , Francis wyffels (UGent) and Tony Belpaeme (UGent)
Abstract
Human emotions can be conveyed through nuanced touch gestures. However, there is a lack of understanding of how consistently emotions can be conveyed to robots through touch. This study explores the consistency of touch-based emotional expression toward a robot by integrating tactile and auditory sensory readings of affective haptic expressions. We developed a piezoresistive pressure sensor and used a microphone to mimic touch and sound channels, respectively. In a study with 28 participants, each conveyed 10 emotions to a robot using spontaneous touch gestures. Our findings reveal a statistically significant consistency in emotion expression among participants. However, some emotions obtained low intraclass correlation values. Additionally, certain emotions with similar levels of arousal or valence did not exhibit significant differences in the way they were conveyed. We subsequently constructed a multi-modal model integrating touch and audio features to decode the 10 emotions. A support vector machine (SVM) model demonstrated the highest accuracy, achieving 40% for 10 classes, with “Attention” being the most accurately conveyed emotion at a balanced accuracy of 87.65%.
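
As a rough illustration of the decoding pipeline described above (feature-level fusion of touch and audio features, an SVM classifier, and balanced accuracy as the evaluation metric), the following Python sketch uses scikit-learn on synthetic data. The feature dimensionalities, the early-fusion scheme, and the train/test split are illustrative assumptions, not the authors' implementation.

# Minimal sketch of the abstract's pipeline: fuse touch (pressure) and
# audio (microphone) features, train an SVM, score with balanced accuracy.
# All data here is synthetic; shapes and hyperparameters are assumptions.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from sklearn.metrics import balanced_accuracy_score

rng = np.random.default_rng(0)
n_samples = 280                                   # e.g. 28 participants x 10 emotions
touch_feats = rng.normal(size=(n_samples, 32))    # hypothetical pressure-map features
audio_feats = rng.normal(size=(n_samples, 20))    # hypothetical microphone features
X = np.hstack([touch_feats, audio_feats])         # early (feature-level) fusion
y = rng.integers(0, 10, size=n_samples)           # 10 emotion classes

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
clf.fit(X_tr, y_tr)
print("balanced accuracy:", balanced_accuracy_score(y_te, clf.predict(X_te)))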
Keywords
Tactile interaction, affective computing, emotion decoding, multi-modal

Downloads

  • (...).pdf | full text (Accepted manuscript) | UGent only (changes to open access on 2026-03-25) | PDF | 1.92 MB
  • (...).pdf | full text (Accepted manuscript) | UGent only | PDF | 3.70 MB

Citation

Please use this URL to cite or link to this publication:

MLA
Ren, Qiaoqiao, et al. “Conveying Emotions to Robots through Touch and Sound.” SOCIAL ROBOTICS, ICSR + AI 2024, PT III, vol. 15563, 2025, pp. 329–39, doi:10.1007/978-981-96-3525-2_28.
APA
Ren, Q., Proesmans, R., Bossuyt, F., Vanfleteren, J., wyffels, F., & Belpaeme, T. (2025). Conveying emotions to robots through touch and sound. SOCIAL ROBOTICS, ICSR + AI 2024, PT III, 15563, 329–339. https://doi.org/10.1007/978-981-96-3525-2_28
Chicago author-date
Ren, Qiaoqiao, Remko Proesmans, Frederick Bossuyt, Jan Vanfleteren, Francis wyffels, and Tony Belpaeme. 2025. “Conveying Emotions to Robots through Touch and Sound.” In SOCIAL ROBOTICS, ICSR + AI 2024, PT III, 15563:329–39. https://doi.org/10.1007/978-981-96-3525-2_28.
Chicago author-date (all authors)
Ren, Qiaoqiao, Remko Proesmans, Frederick Bossuyt, Jan Vanfleteren, Francis wyffels, and Tony Belpaeme. 2025. “Conveying Emotions to Robots through Touch and Sound.” In SOCIAL ROBOTICS, ICSR + AI 2024, PT III, 15563:329–339. doi:10.1007/978-981-96-3525-2_28.
Vancouver
1. Ren Q, Proesmans R, Bossuyt F, Vanfleteren J, wyffels F, Belpaeme T. Conveying emotions to robots through touch and sound. In: SOCIAL ROBOTICS, ICSR + AI 2024, PT III. 2025. p. 329–39.
IEEE
[1] Q. Ren, R. Proesmans, F. Bossuyt, J. Vanfleteren, F. wyffels, and T. Belpaeme, “Conveying emotions to robots through touch and sound,” in SOCIAL ROBOTICS, ICSR + AI 2024, PT III, Odense, Denmark, 2025, vol. 15563, pp. 329–339.
@inproceedings{01JQBKNBYG3S2MEABC1FG6Z0K1,
  abstract     = {{Human emotions can be conveyed through nuanced touch gestures. However, there is a lack of understanding of how consistently emotions can be conveyed to robots through touch. This study explores the consistency of touch-based emotional expression toward a robot by integrating tactile and auditory sensory readings of affective haptic expressions. We developed a piezoresistive pressure sensor and used a microphone to mimic touch and sound channels, respectively. In a study with 28 participants, each conveyed 10 emotions to a robot using spontaneous touch gestures. Our findings reveal a statistically significant consistency in emotion expression among participants. However, some emotions obtained low intraclass correlation values. Additionally, certain emotions with similar levels of arousal or valence did not exhibit significant differences in the way they were conveyed. We subsequently constructed a multi-modal model integrating touch and audio features to decode the 10 emotions. A support vector machine (SVM) model demonstrated the highest accuracy, achieving 40% for 10 classes, with “Attention” being the most accurately conveyed emotion at a balanced accuracy of 87.65%.}},
  author       = {{Ren, Qiaoqiao and Proesmans, Remko and Bossuyt, Frederick and Vanfleteren, Jan and wyffels, Francis and Belpaeme, Tony}},
  booktitle    = {{SOCIAL ROBOTICS, ICSR + AI 2024, PT III}},
  isbn         = {{9789819635245}},
  issn         = {{0302-9743}},
  keywords     = {{Tactile interaction,affective computing,emotion decoding,multi-modal}},
  language     = {{eng}},
  location     = {{Odense, Denmark}},
  pages        = {{329--339}},
  title        = {{Conveying emotions to robots through touch and sound}},
  url          = {{https://doi.org/10.1007/978-981-96-3525-2_28}},
  volume       = {{15563}},
  year         = {{2025}},
}
