
From expressive gesture to sound: the development of an embodied mapping trajectory inside a musical interface

Pieter-Jan Maes (UGent) , Marc Leman (UGent) , Micheline Lesaffre (UGent) , Michiel Demey (UGent) and Dirk Moelants (UGent)
Abstract
This paper contributes to the development of a multimodal, musical tool that extends the natural action range of the human body to communicate expressiveness into the virtual music domain. The core of this musical tool consists of a low-cost, highly functional computational model developed on the Max/MSP platform that (1) captures real-time movement of the human body into a 3D coordinate system on the basis of the orientation output of any type of inertial sensor system that is OSC-compatible, (2) extracts low-level movement features that specify the amount of contraction/expansion as a measure of how a subject uses the surrounding space, (3) recognizes these movement features as expressive gestures, and (4) creates a mapping trajectory between these expressive gestures and the sound synthesis process of adding harmonically related voices to an originally monophonic voice. The concern for a user-oriented and intuitive mapping strategy was thereby of central importance. This was achieved by conducting an empirical experiment based on theoretical concepts from the embodied music cognition paradigm. Based on empirical evidence, this paper proposes a mapping trajectory that facilitates the interaction between a musician and his or her instrument, the artistic collaboration among (multimedia) artists, and the communication of expressiveness in a social, musical context.
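The contraction/expansion feature described in step (2) can be illustrated with a minimal sketch. The paper's exact formula is not given here, so the function name and the specific proxy used below (mean distance of tracked body points from their centroid) are assumptions for illustration only:

```python
import math

def contraction_expansion_index(points):
    """Mean distance of tracked 3D body points from their centroid.

    A larger value indicates a more expanded posture that uses more of
    the surrounding space; a smaller value indicates contraction. This
    is an illustrative proxy, not the paper's published feature.
    """
    n = len(points)
    # Centroid of the tracked points
    cx = sum(p[0] for p in points) / n
    cy = sum(p[1] for p in points) / n
    cz = sum(p[2] for p in points) / n
    # Average Euclidean distance from the centroid
    return sum(math.dist(p, (cx, cy, cz)) for p in points) / n

# Example: hands held close to the torso vs. arms spread out
contracted = [(0.0, 0.0, 0.0), (0.1, 1.0, 0.0), (-0.1, 1.0, 0.0)]
expanded   = [(0.0, 0.0, 0.0), (0.8, 1.0, 0.0), (-0.8, 1.0, 0.0)]
assert contraction_expansion_index(expanded) > contraction_expansion_index(contracted)
```

In a live setting such a scalar would be computed per frame from the sensor-derived joint positions and then scaled to a synthesis parameter, which is the kind of mapping trajectory the paper evaluates.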
Keywords
multimodal interface, mapping, inertial sensing technique, usability testing, IPEMexpressive, IPEMapplication

Downloads

  • Maes PJ Paper JMUI FINAL.pdf (full text | open access | PDF | 2.24 MB)

Citation

Please use this URL to cite or link to this publication:

Chicago
Maes, Pieter-Jan, Marc Leman, Micheline Lesaffre, Michiel Demey, and Dirk Moelants. 2010. “From Expressive Gesture to Sound: The Development of an Embodied Mapping Trajectory Inside a Musical Interface.” Ed. Ginevra Castellano. Journal of Multimodal User Interfaces 3 (1-2): 67–78.
APA
Maes, P.-J., Leman, M., Lesaffre, M., Demey, M., & Moelants, D. (2010). From expressive gesture to sound: the development of an embodied mapping trajectory inside a musical interface. (G. Castellano, Ed.)JOURNAL OF MULTIMODAL USER INTERFACES, 3(1-2), 67–78.
Vancouver
1. Maes P-J, Leman M, Lesaffre M, Demey M, Moelants D. From expressive gesture to sound: the development of an embodied mapping trajectory inside a musical interface. Castellano G, editor. JOURNAL OF MULTIMODAL USER INTERFACES. 2010;3(1-2):67–78.
MLA
Maes, Pieter-Jan, Marc Leman, Micheline Lesaffre, et al. “From Expressive Gesture to Sound: The Development of an Embodied Mapping Trajectory Inside a Musical Interface.” Ed. Ginevra Castellano. JOURNAL OF MULTIMODAL USER INTERFACES 3.1-2 (2010): 67–78. Print.
@article{905787,
  abstract     = {This paper contributes to the development of a multimodal, musical tool that extends the natural action range of the human body to communicate expressiveness into the virtual music domain. The core of this musical tool consists of a low-cost, highly functional computational model developed on the Max/MSP platform that (1) captures real-time movement of the human body into a 3D coordinate system on the basis of the orientation output of any type of inertial sensor system that is OSC-compatible, (2) extracts low-level movement features that specify the amount of contraction/expansion as a measure of how a subject uses the surrounding space, (3) recognizes these movement features as expressive gestures, and (4) creates a mapping trajectory between these expressive gestures and the sound synthesis process of adding harmonically related voices to an originally monophonic voice. The concern for a user-oriented and intuitive mapping strategy was thereby of central importance. This was achieved by conducting an empirical experiment based on theoretical concepts from the embodied music cognition paradigm. Based on empirical evidence, this paper proposes a mapping trajectory that facilitates the interaction between a musician and his or her instrument, the artistic collaboration among (multimedia) artists, and the communication of expressiveness in a social, musical context.},
  author       = {Maes, Pieter-Jan and Leman, Marc and Lesaffre, Micheline and Demey, Michiel and Moelants, Dirk},
  editor       = {Castellano, Ginevra},
  issn         = {1783-7677},
  journal      = {JOURNAL OF MULTIMODAL USER INTERFACES},
  keyword      = {multimodal interface, mapping, inertial sensing technique, usability testing, IPEMexpressive, IPEMapplication},
  language     = {eng},
  number       = {1-2},
  pages        = {67--78},
  title        = {From expressive gesture to sound: the development of an embodied mapping trajectory inside a musical interface},
  url          = {http://dx.doi.org/10.1007/s12193-009-0027-3},
  volume       = {3},
  year         = {2010},
}
