
A Framework for music-based interactive sonification

Nuno Correia Da Silva Diniz (UGent) , Alexander Deweppe (UGent) , Michiel Demey (UGent) and Marc Leman (UGent)
Abstract
In this paper, a framework for interactive sonification is introduced. It is argued that electroacoustic composition techniques can provide a methodology for structuring and presenting multivariable data through sound. Furthermore, an embodied music cognition driven interface is applied to provide an interactive exploration of the generated output. The motivation and theoretical foundation for this work are presented as well as the framework’s implementation and an exploratory use case.

Downloads

  • ICAD2010 NDiniz.pdf — full text | open access | PDF | 2.03 MB

Citation

Please use this URL to cite or link to this publication:

Chicago
Correia Da Silva Diniz, Nuno, Alexander Deweppe, Michiel Demey, and Marc Leman. 2010. “A Framework for Music-based Interactive Sonification.” In 16th International Conference on Auditory Display, Proceedings.
APA
Correia Da Silva Diniz, N., Deweppe, A., Demey, M., & Leman, M. (2010). A Framework for music-based interactive sonification. 16th International Conference on Auditory Display, Proceedings. Presented at the 16th International Conference on Auditory Display (ICAD-2010).
Vancouver
1. Correia Da Silva Diniz N, Deweppe A, Demey M, Leman M. A Framework for music-based interactive sonification. 16th International Conference on Auditory Display, Proceedings. 2010.
MLA
Correia Da Silva Diniz, Nuno, et al. “A Framework for Music-based Interactive Sonification.” 16th International Conference on Auditory Display, Proceedings. 2010. Print.
BibTeX
@inproceedings{926557,
  abstract     = {In this paper, a framework for interactive sonification is introduced. It is argued that electroacoustic composition techniques can provide a methodology for structuring and presenting multivariable data through sound. Furthermore, an embodied music cognition driven interface is applied to provide an interactive exploration of the generated output. The motivation and theoretical foundation for this work are presented as well as the framework{\textquoteright}s implementation and an exploratory use case.},
  author       = {Correia Da Silva Diniz, Nuno and Deweppe, Alexander and Demey, Michiel and Leman, Marc},
  booktitle    = {16th International Conference on Auditory Display, Proceedings},
  language     = {eng},
  location     = {George Washington University, Washington DC, USA},
  title        = {A Framework for music-based interactive sonification},
  year         = {2010},
}