Ghent University Academic Bibliography


Toward E-motion based music retrieval: a study of affective gesture recognition

Denis Amelynck UGent, Maarten Grachten, Leon van Noorden and Marc Leman UGent (2012) IEEE TRANSACTIONS ON AFFECTIVE COMPUTING. 3(2). p.250-259
abstract
The widespread availability of digitized music collections and mobile music players has enabled us to listen to music during many of our daily activities, such as physical exercise, commuting, and relaxation, and many people enjoy doing so. A practical problem that comes along with the wish to listen to music is that of music retrieval: the selection of desired music from a music collection. In this paper, we propose a new approach to facilitate music retrieval. Modern smartphones are commonly used as music players and are already equipped with inertial sensors that are suitable for obtaining motion information. In the proposed approach, emotion is derived automatically from arm gestures and is used to query a music collection. We derive predictive models for valence and arousal from empirical data, gathered in an experimental setup where inertial data recorded from arm movements are coupled to musical emotion. Part of the experiment is a preliminary study confirming that human subjects are generally capable of recognizing affect from arm gestures. Model validation in the main study confirmed the predictive capabilities of the models.
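The pipeline the abstract describes (inertial features from arm gestures → predicted valence/arousal → nearest match in an emotion-tagged music collection) can be sketched as follows. This is a minimal illustrative sketch, not the authors' actual models: the feature choices, the linear-model coefficients, and the toy collection are all invented for illustration.

```python
import math

def gesture_features(accel_samples):
    """Toy features from inertial data: mean and variance of the
    acceleration magnitude over an arm-gesture recording.
    (Illustrative only; the paper's actual features differ.)"""
    mags = [math.sqrt(x * x + y * y + z * z) for x, y, z in accel_samples]
    mean = sum(mags) / len(mags)
    var = sum((m - mean) ** 2 for m in mags) / len(mags)
    return mean, var

def predict_affect(mean, var):
    """Toy linear models mapping features to valence and arousal in [-1, 1].
    The coefficients here are made up, standing in for ones fit to data."""
    arousal = 0.5 * mean + 0.3 * var - 1.0
    valence = 0.1 * mean - 0.2 * var + 0.3
    clip = lambda z: max(-1.0, min(1.0, z))
    return clip(valence), clip(arousal)

def retrieve(collection, valence, arousal):
    """Return the song whose (valence, arousal) tag is nearest, in squared
    Euclidean distance, to the affect predicted from the gesture."""
    return min(
        collection,
        key=lambda s: (s["valence"] - valence) ** 2 + (s["arousal"] - arousal) ** 2,
    )

# Example: a calm gesture should retrieve the low-arousal song.
collection = [
    {"title": "energetic track", "valence": 0.8, "arousal": 0.9},
    {"title": "calm track", "valence": 0.4, "arousal": -0.7},
]
mean, var = gesture_features([(0.0, 0.0, 1.0), (0.0, 0.1, 0.9)])
valence, arousal = predict_affect(mean, var)
song = retrieve(collection, valence, arousal)
```

The design choice of querying by a point in the valence-arousal plane mirrors the dimensional model of emotion the abstract refers to; a real system would fit the regression weights to the empirical gesture-emotion data described in the paper.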
author: Denis Amelynck (UGent), Maarten Grachten, Leon van Noorden, Marc Leman (UGent)
year: 2012
type: journalArticle (original)
publication status: published
keyword: EMOTIONAL RESPONSES, PERCEPTION, LIGHT DISPLAYS, POINT-LIGHT, MOVEMENT, EXPRESSIONS, FEATURES, IMAGERY, MODELS, CORTEX, Affect detection, expressive gestures, music retrieval, human computer interfaces
journal title: IEEE TRANSACTIONS ON AFFECTIVE COMPUTING (IEEE trans. affect. comput.)
volume: 3
issue: 2
pages: 250–259
Web of Science type: Article
Web of Science id: 000323627000011
ISSN: 1949-3045
DOI: 10.1109/T-AFFC.2011.39
language: English
UGent publication?: yes
classification: A1
copyright statement: I have transferred the copyright for this publication to the publisher
VABB id: c:vabb:337481
VABB type: VABB-1
id: 2972735
handle: http://hdl.handle.net/1854/LU-2972735
date created: 2012-08-17 13:29:13
date last changed: 2014-02-06 15:50:07
@article{2972735,
  abstract     = {The widespread availability of digitized music collections and mobile music players have enabled us to listen to music during many of our daily activities, such as physical exercise, commuting, relaxation, and many people enjoy this. A practical problem that comes along with the wish to listen to music is that of music retrieval, the selection of desired music from a music collection. In this paper, we propose a new approach to facilitate music retrieval. Modern smart phones are commonly used as music players and are already equipped with inertial sensors that are suitable for obtaining motion information. In the proposed approach, emotion is derived automatically from arm gestures and is used to query a music collection. We derive predictive models for valence and arousal from empirical data, gathered in an experimental setup where inertial data recorded from arm movements are coupled to musical emotion. Part of the experiment is a preliminary study confirming that human subjects are generally capable of recognizing affect from arm gestures. Model validation in the main study confirmed the predictive capabilities of the models.},
  author       = {Amelynck, Denis and Grachten, Maarten and van Noorden, Leon and Leman, Marc},
  issn         = {1949-3045},
  journal      = {IEEE TRANSACTIONS ON AFFECTIVE COMPUTING},
  keyword      = {EMOTIONAL RESPONSES,PERCEPTION,LIGHT DISPLAYS,POINT-LIGHT,MOVEMENT,EXPRESSIONS,FEATURES,IMAGERY,MODELS,CORTEX,Affect detection,expressive gestures,music retrieval,human computer interfaces},
  language     = {eng},
  number       = {2},
  pages        = {250--259},
  title        = {Toward E-motion based music retrieval: a study of affective gesture recognition},
  url          = {http://dx.doi.org/10.1109/T-AFFC.2011.39},
  volume       = {3},
  year         = {2012},
}

Chicago
Amelynck, Denis, Maarten Grachten, Leon van Noorden, and Marc Leman. 2012. “Toward E-motion Based Music Retrieval: A Study of Affective Gesture Recognition.” IEEE Transactions on Affective Computing 3 (2): 250–259.
APA
Amelynck, D., Grachten, M., van Noorden, L., & Leman, M. (2012). Toward E-motion based music retrieval: a study of affective gesture recognition. IEEE Transactions on Affective Computing, 3(2), 250–259.
Vancouver
1. Amelynck D, Grachten M, van Noorden L, Leman M. Toward E-motion based music retrieval: a study of affective gesture recognition. IEEE Transactions on Affective Computing. 2012;3(2):250–9.
MLA
Amelynck, Denis, Maarten Grachten, Leon van Noorden, et al. “Toward E-motion Based Music Retrieval: A Study of Affective Gesture Recognition.” IEEE Transactions on Affective Computing 3.2 (2012): 250–259. Print.