Ghent University Academic Bibliography

From expressive gesture to sound: the development of an embodied mapping trajectory inside a musical interface

Pieter-Jan Maes (UGent), Marc Leman (UGent), Micheline Lesaffre (UGent), Michiel Demey and Dirk Moelants (UGent) (2010) JOURNAL OF MULTIMODAL USER INTERFACES. 3(1-2). p.67-78
abstract
This paper contributes to the development of a multimodal musical tool that extends the natural action range of the human body to communicate expressiveness into the virtual music domain. The core of this musical tool is a low-cost, highly functional computational model, developed on the Max/MSP platform, that (1) captures real-time movement of the human body into a 3D coordinate system on the basis of the orientation output of any OSC-compatible inertial sensor system, (2) extracts low-level movement features that specify the amount of contraction/expansion as a measure of how a subject uses the surrounding space, (3) recognizes these movement features as expressive gestures, and (4) creates a mapping trajectory between these expressive gestures and a sound synthesis process that adds harmonically related voices to an originally monophonic voice. A user-oriented, intuitive mapping strategy was of central importance, and was achieved by conducting an empirical experiment grounded in theoretical concepts from the embodied music cognition paradigm. Based on this empirical evidence, the paper proposes a mapping trajectory that facilitates the interaction between a musician and his instrument, the artistic collaboration between (multimedia) artists, and the communication of expressiveness in a social, musical context.
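The pipeline described in the abstract (orientation capture over OSC, contraction/expansion feature extraction, and a gesture-to-sound mapping) was realized as a Max/MSP patch; the sketch below is only a minimal Python illustration of steps (2) and (4), assuming 3D joint positions have already been reconstructed from the inertial-sensor orientations. All function names, calibration values, and the linear mapping are illustrative assumptions, not the paper's implementation.

    # Minimal illustration (NOT the paper's Max/MSP model): derive a
    # contraction/expansion index from 3D joint positions and map it to a
    # number of added harmonically related voices. All names, calibration
    # values, and the linear mapping below are assumptions.
    import numpy as np

    def contraction_expansion_index(joints):
        # joints: (N, 3) array of positions (e.g. hands/elbows) assumed to be
        # reconstructed from OSC-delivered inertial-sensor orientations.
        # Mean distance from the body centroid: larger = more expanded pose.
        centroid = joints.mean(axis=0)
        return float(np.linalg.norm(joints - centroid, axis=1).mean())

    def voices_to_add(index, contracted=0.3, expanded=0.9, max_voices=3):
        # Normalize the index between two calibration poses (metres, assumed)
        # and map it linearly to 0..max_voices harmonic voices.
        t = min(max((index - contracted) / (expanded - contracted), 0.0), 1.0)
        return round(t * max_voices)

    # Example: a fairly expanded pose adds voices to the monophonic input.
    pose = np.array([[0.6, 0.1, 0.2], [-0.6, 0.1, 0.2],
                     [0.3, -0.4, 0.1], [-0.3, -0.4, 0.1]])
    idx = contraction_expansion_index(pose)
    print(f"expansion index = {idx:.2f}, voices to add = {voices_to_add(idx)}")

In a real-time system such an index would be computed per frame and smoothed before driving the synthesis stage; the paper itself derives its mapping empirically from the embodied-music-cognition experiment rather than from a fixed linear rule.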
Please use this URL to cite or link to this publication: http://hdl.handle.net/1854/LU-905787
author
Pieter-Jan Maes, Marc Leman, Micheline Lesaffre, Michiel Demey and Dirk Moelants
year
2010
type
journalArticle (original)
publication status
published
keyword
multimodal interface, mapping, inertial sensing technique, usability testing, IPEMexpressive, IPEMapplication
journal title
JOURNAL OF MULTIMODAL USER INTERFACES
J. Multimodal User Interfaces
editor
Ginevra Castellano
volume
3
issue
1-2
issue title
From expressive gesture to sound: The development of an embodied mapping trajectory inside a musical interface
pages
67 - 78
Web of Science type
Article
Web of Science id
000208480100006
ISSN
1783-7677
DOI
10.1007/s12193-009-0027-3
language
English
UGent publication?
yes
classification
A1
copyright statement
I have transferred the copyright for this publication to the publisher
VABB id
c:vabb:299359
VABB type
VABB-1
id
905787
handle
http://hdl.handle.net/1854/LU-905787
date created
2010-03-16 11:38:39
date last changed
2016-12-19 15:46:38
@article{905787,
  abstract     = {This paper contributes to the development of a multimodal musical tool that extends the natural action range of the human body to communicate expressiveness into the virtual music domain. The core of this musical tool is a low-cost, highly functional computational model, developed on the Max/MSP platform, that (1) captures real-time movement of the human body into a 3D coordinate system on the basis of the orientation output of any OSC-compatible inertial sensor system, (2) extracts low-level movement features that specify the amount of contraction/expansion as a measure of how a subject uses the surrounding space, (3) recognizes these movement features as expressive gestures, and (4) creates a mapping trajectory between these expressive gestures and a sound synthesis process that adds harmonically related voices to an originally monophonic voice. A user-oriented, intuitive mapping strategy was of central importance, and was achieved by conducting an empirical experiment grounded in theoretical concepts from the embodied music cognition paradigm. Based on this empirical evidence, the paper proposes a mapping trajectory that facilitates the interaction between a musician and his instrument, the artistic collaboration between (multimedia) artists, and the communication of expressiveness in a social, musical context.},
  author       = {Maes, Pieter-Jan and Leman, Marc and Lesaffre, Micheline and Demey, Michiel and Moelants, Dirk},
  editor       = {Castellano, Ginevra},
  issn         = {1783-7677},
  journal      = {JOURNAL OF MULTIMODAL USER INTERFACES},
  keyword      = {multimodal interface,mapping,inertial sensing technique,usability testing,IPEMexpressive,IPEMapplication},
  language     = {eng},
  number       = {1-2},
  pages        = {67--78},
  title        = {From expressive gesture to sound: the development of an embodied mapping trajectory inside a musical interface},
  url          = {http://dx.doi.org/10.1007/s12193-009-0027-3},
  volume       = {3},
  year         = {2010},
}

Chicago
Maes, Pieter-Jan, Marc Leman, Micheline Lesaffre, Michiel Demey, and Dirk Moelants. 2010. “From Expressive Gesture to Sound: The Development of an Embodied Mapping Trajectory Inside a Musical Interface.” Ed. Ginevra Castellano. Journal of Multimodal User Interfaces 3 (1-2): 67–78.
APA
Maes, P.-J., Leman, M., Lesaffre, M., Demey, M., & Moelants, D. (2010). From expressive gesture to sound: the development of an embodied mapping trajectory inside a musical interface. (G. Castellano, Ed.) JOURNAL OF MULTIMODAL USER INTERFACES, 3(1-2), 67–78.
Vancouver
1. Maes P-J, Leman M, Lesaffre M, Demey M, Moelants D. From expressive gesture to sound: the development of an embodied mapping trajectory inside a musical interface. Castellano G, editor. JOURNAL OF MULTIMODAL USER INTERFACES. 2010;3(1-2):67–78.
MLA
Maes, Pieter-Jan, Marc Leman, Micheline Lesaffre, et al. “From Expressive Gesture to Sound: The Development of an Embodied Mapping Trajectory Inside a Musical Interface.” Ed. Ginevra Castellano. JOURNAL OF MULTIMODAL USER INTERFACES 3.1-2 (2010): 67–78. Print.