Synchronizing multimodal recordings using audio-to-audio alignment: an application of acoustic fingerprinting to facilitate music interaction research

Joren Six (UGent) and Marc Leman (UGent)
Abstract
Research on the interaction between movement and music often involves analysis of multi-track audio, video streams and sensor data. To facilitate such research, a framework is presented here that allows synchronization of multimodal data. A low-cost approach is proposed to synchronize streams by embedding ambient audio into each data stream. This effectively reduces the synchronization problem to audio-to-audio alignment. As part of the framework, a robust, computationally efficient audio-to-audio alignment algorithm is presented for reliable synchronization of embedded audio streams of varying quality. The algorithm uses audio fingerprinting techniques to measure offsets. It also identifies drift and dropped samples, which makes it possible to find a synchronization solution under such circumstances as well. The framework is evaluated with synthetic signals and a case study, showing millisecond-accurate synchronization.
Keywords
Multimodal data synchronization, Audio-to-audio alignment, Audio fingerprinting, Music performance research, Digital signal processing
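
To give a concrete picture of the fingerprinting step described in the abstract, the sketch below illustrates one common way to estimate the offset between two recordings that share the same ambient audio: extract spectral-peak landmark hashes from each stream, match them, and take the most frequent difference between matching hash times as the offset. This is a minimal illustration only, not the authors' algorithm; the helper names (spectral_peaks, landmark_hashes, estimate_offset_frames) and all parameter values (FFT size, hop, fan-out, peak thresholds) are assumptions chosen for the example.

# Minimal sketch (Python, NumPy/SciPy) of fingerprint-based offset estimation.
# Not the paper's implementation; parameters and hashing scheme are illustrative.
import numpy as np
from scipy.signal import stft
from scipy.ndimage import maximum_filter

def spectral_peaks(audio, sr, n_fft=2048, hop=512, neighborhood=15):
    """Return (frame, bin) pairs of prominent local maxima in the spectrogram."""
    _, _, spec = stft(audio, fs=sr, nperseg=n_fft, noverlap=n_fft - hop)
    mag = np.abs(spec)
    local_max = maximum_filter(mag, size=neighborhood) == mag
    strong = mag > 5 * np.median(mag)          # keep only clearly salient peaks
    bins, frames = np.nonzero(local_max & strong)
    return list(zip(frames, bins))

def landmark_hashes(peaks, fan_out=5, max_dt=64):
    """Pair each peak with a few later peaks to form (bin1, bin2, dt) hashes."""
    peaks = sorted(peaks)
    hashes = []
    for i, (t1, b1) in enumerate(peaks):
        for t2, b2 in peaks[i + 1:i + 1 + fan_out]:
            dt = t2 - t1
            if 0 < dt <= max_dt:
                hashes.append(((b1, b2, dt), t1))   # hash plus its anchor frame
    return hashes

def estimate_offset_frames(reference, query, sr):
    """Estimate, in STFT frames, how far the query stream lags the reference."""
    ref_index = {}
    for h, t in landmark_hashes(spectral_peaks(reference, sr)):
        ref_index.setdefault(h, []).append(t)
    deltas = []
    for h, t_query in landmark_hashes(spectral_peaks(query, sr)):
        for t_ref in ref_index.get(h, []):
            deltas.append(t_ref - t_query)
    if not deltas:
        return None                                 # streams share no fingerprints
    values, counts = np.unique(deltas, return_counts=True)
    return int(values[np.argmax(counts)])           # modal time difference

Multiplying the returned frame offset by hop / sr converts it to seconds. Running the estimate on successive chunks of the query would expose drift or dropped samples as a changing offset, which the paper's algorithm handles explicitly while this sketch does not.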

Citation

Chicago
Six, Joren, and Marc Leman. 2015. “Synchronizing Multimodal Recordings Using Audio-to-audio Alignment: An Application of Acoustic Fingerprinting to Facilitate Music Interaction Research.” Ed. Jean-Claude Martin. Journal on Multimodal User Interfaces 9 (3): 223–229.
APA
Six, J., & Leman, M. (2015). Synchronizing multimodal recordings using audio-to-audio alignment: an application of acoustic fingerprinting to facilitate music interaction research. (J.-C. Martin, Ed.) JOURNAL ON MULTIMODAL USER INTERFACES, 9(3), 223–229.
Vancouver
1. Six J, Leman M. Synchronizing multimodal recordings using audio-to-audio alignment: an application of acoustic fingerprinting to facilitate music interaction research. Martin J-C, editor. JOURNAL ON MULTIMODAL USER INTERFACES. Springer; 2015;9(3):223–9.
MLA
Six, Joren, and Marc Leman. “Synchronizing Multimodal Recordings Using Audio-to-audio Alignment: An Application of Acoustic Fingerprinting to Facilitate Music Interaction Research.” Ed. Jean-Claude Martin. JOURNAL ON MULTIMODAL USER INTERFACES 9.3 (2015): 223–229. Print.
@article{6873558,
  abstract     = {Research on the interaction between movement and music often involves analysis of multi-track audio, video streams and sensor data. To facilitate such research, a framework is presented here that allows synchronization of multimodal data. A low-cost approach is proposed to synchronize streams by embedding ambient audio into each data stream. This effectively reduces the synchronization problem to audio-to-audio alignment. As part of the framework, a robust, computationally efficient audio-to-audio alignment algorithm is presented for reliable synchronization of embedded audio streams of varying quality. The algorithm uses audio fingerprinting techniques to measure offsets. It also identifies drift and dropped samples, which makes it possible to find a synchronization solution under such circumstances as well. The framework is evaluated with synthetic signals and a case study, showing millisecond-accurate synchronization.},
  author       = {Six, Joren and Leman, Marc},
  editor       = {Martin, Jean-Claude},
  issn         = {1783-7677},
  journal      = {JOURNAL ON MULTIMODAL USER INTERFACES},
  keyword      = {Multimodal data synchronization,Audio-to-audio-alignment,Audio fingerprinting,Music performance research,Digital signal processing},
  language     = {eng},
  number       = {3},
  pages        = {223--229},
  publisher    = {Springer},
  title        = {Synchronizing multimodal recordings using audio-to-audio alignment: an application of acoustic fingerprinting to facilitate music interaction research},
  url          = {http://dx.doi.org/10.1007/s12193-015-0196-1},
  volume       = {9},
  year         = {2015},
}
