Ghent University Academic Bibliography


A model-based sonification system for directional movement behavior

Pieter-Jan Maes (UGent), Marc Leman (UGent) and Micheline Lesaffre (UGent) (2010)
abstract
Computational algorithms are presented that create a virtual model of a person’s kinesphere (i.e. a concept of Laban denoting the space immediately surrounding a person’s body and reachable by the upper limbs). This model is approached as a virtual sound object/instrument (VSO) that can be “played” by moving the upper limbs in particular directions. As such, it provides an alternative to visual qualitative movement-analysis tools such as bar plots. This model-based sonification system emphasizes the role of interaction in sonification. Moreover, this study claims that the integration of intentionality and expressivity in auditory biofeedback interaction systems is necessary to make the sonification process more precise and transparent. A method is proposed, based on embodied music cognition theory, that does this without abandoning the scientific, systematic principles underlying the process of sonification.
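The abstract describes the system only at a high level, and the paper's own algorithms are not reproduced in this record. As a minimal sketch of the idea of "playing" the kinesphere by directional upper-limb movement, the Python example below maps the direction and speed of a hand displacement to pitch and amplitude. The azimuth-to-pitch and speed-to-amplitude mappings, the 120 fps frame rate, and all function names are hypothetical assumptions for illustration, not the authors' method.

import math

# Hypothetical sketch only: direction of a hand movement inside the
# kinesphere -> pitch, movement speed -> amplitude. Positions are
# (x, y, z) coordinates in metres relative to the body centre.

def movement_vector(prev, curr):
    """Displacement of a hand marker between two motion-capture frames."""
    return tuple(c - p for p, c in zip(prev, curr))

def azimuth(v):
    """Horizontal direction of the movement in radians (-pi..pi)."""
    x, y, _ = v
    return math.atan2(y, x)

def speed(v, dt):
    """Magnitude of the displacement per second."""
    return math.sqrt(sum(c * c for c in v)) / dt

def sonify(prev, curr, dt=1 / 120):
    """Map one frame pair to (frequency_hz, amplitude).

    Azimuth spans one octave above 220 Hz; amplitude grows with speed
    and saturates at 1.0. Both mappings are illustrative assumptions.
    """
    v = movement_vector(prev, curr)
    freq = 220.0 * 2 ** ((azimuth(v) + math.pi) / (2 * math.pi))  # 220-440 Hz
    amp = min(1.0, speed(v, dt) / 2.0)  # clip at 2 m/s
    return freq, amp

# Example: the hand moves 5 cm to the right between two 120 fps frames.
print(sonify((0.0, 0.0, 1.2), (0.05, 0.0, 1.2)))

A real implementation would replace these mappings with ones grounded in the embodied music cognition principles the abstract refers to; the sketch only shows the shape of a directional-movement-to-sound pipeline.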
Please use this URL to cite or link to this publication: http://hdl.handle.net/1854/LU-925845
author
Pieter-Jan Maes (UGent), Marc Leman (UGent) and Micheline Lesaffre (UGent)
organization
year
2010
type
conference
publication status
published
subject
keyword
expressive gesture, sonification, human-computer interaction, embodiment
pages
4 pages
conference name
Interactive Sonification Workshop (ISon)
conference location
KTH, Stockholm, Sweden
conference start
2010-04-07
conference end
2010-04-07
language
English
UGent publication?
yes
classification
C1
copyright statement
I don't know the status of the copyright for this publication
id
925845
handle
http://hdl.handle.net/1854/LU-925845
date created
2010-04-15 09:49:58
date last changed
2016-12-19 15:36:59
@inproceedings{925845,
  abstract     = {Computational algorithms are presented that create a virtual model of a person's kinesphere (i.e. a concept of Laban denoting the space immediately surrounding a person's body and reachable by the upper limbs). This model is approached as a virtual sound object/instrument (VSO) that can be ``played'' by moving the upper limbs in particular directions. As such, it provides an alternative to visual qualitative movement-analysis tools such as bar plots. This model-based sonification system emphasizes the role of interaction in sonification. Moreover, this study claims that the integration of intentionality and expressivity in auditory biofeedback interaction systems is necessary to make the sonification process more precise and transparent. A method is proposed, based on embodied music cognition theory, that does this without abandoning the scientific, systematic principles underlying the process of sonification.},
  author       = {Maes, Pieter-Jan and Leman, Marc and Lesaffre, Micheline},
  booktitle    = {Interactive Sonification Workshop (ISon)},
  keyword      = {expressive gesture, sonification, human-computer interaction, embodiment},
  language     = {eng},
  location     = {KTH, Stockholm, Sweden},
  pages        = {4},
  title        = {A model-based sonification system for directional movement behavior},
  year         = {2010},
}

Chicago
Maes, Pieter-Jan, Marc Leman, and Micheline Lesaffre. 2010. “A Model-Based Sonification System for Directional Movement Behavior.” In Interactive Sonification Workshop (ISon).
APA
Maes, P.-J., Leman, M., & Lesaffre, M. (2010). A model-based sonification system for directional movement behavior. Presented at the Interactive Sonification Workshop (ISon).
Vancouver
1. Maes P-J, Leman M, Lesaffre M. A model-based sonification system for directional movement behavior. 2010.
MLA
Maes, Pieter-Jan, Marc Leman, and Micheline Lesaffre. “A Model-Based Sonification System for Directional Movement Behavior.” 2010. Print.