Interactive sonification of expressive hand gestures on a handheld device
Authors: | Roberto Bresin, Marco Fabiani, Gaël Dubus |
---|---|
Year of publication: | 2011 |
Subject: |
Computer and Information Sciences; Human-Computer Interaction; Sonification; Emotional hand gestures; Automatic music performance; Articulation (music); Mobile phone; Mobile device; Signal Processing; Gesture |
Source: | Journal on Multimodal User Interfaces 6: 49–57 |
ISSN: | 1783-8738 1783-7677 |
DOI: | [10.1007/s12193-011-0076-2](https://doi.org/10.1007/s12193-011-0076-2) |
Abstract: | We present a mobile phone application called MoodifierLive, which uses expressive music performance for the sonification of expressive hand gestures by mapping the phone's accelerometer data to performance parameters (tempo, sound level, and articulation). The application, and in particular its sonification principle, is described in detail. An experiment was carried out to evaluate the perceived match between a gesture and the music performance it produced, using two distinct gesture-to-performance mappings. The results show that the application produces consistent performances, and that the mapping based on data collected from real gestures works better than the one defined a priori by the authors. |
Database: | OpenAIRE |
External link: |
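The abstract describes mapping accelerometer data to three performance parameters (tempo, sound level, articulation). A minimal sketch of such an a-priori mapping is shown below; the parameter ranges, the use of acceleration magnitude as a gesture-energy proxy, and the `map_gesture` function are illustrative assumptions, not the actual mapping implemented in MoodifierLive or the data-driven mapping evaluated in the paper.

```python
import math

# Hypothetical parameter ranges (not the paper's values): tempo in BPM,
# sound level in dB relative to a reference, and articulation as the
# ratio of sounded note duration to inter-onset interval.
TEMPO_RANGE = (60.0, 180.0)
LEVEL_RANGE = (-12.0, 0.0)
ARTICULATION_RANGE = (0.3, 1.0)  # 1.0 = fully legato, lower = more staccato

def normalize(value, lo, hi):
    """Clamp value to [lo, hi] and rescale to [0, 1]."""
    return (min(max(value, lo), hi) - lo) / (hi - lo)

def map_gesture(accel_xyz, max_accel=20.0):
    """Map one raw accelerometer sample (m/s^2) to performance parameters.

    Gesture energy is approximated by the magnitude of the acceleration
    vector: more energetic gestures yield faster, louder, and more
    staccato performances.
    """
    ax, ay, az = accel_xyz
    magnitude = math.sqrt(ax * ax + ay * ay + az * az)
    energy = normalize(magnitude, 0.0, max_accel)  # 0 = still, 1 = vigorous
    tempo = TEMPO_RANGE[0] + energy * (TEMPO_RANGE[1] - TEMPO_RANGE[0])
    level = LEVEL_RANGE[0] + energy * (LEVEL_RANGE[1] - LEVEL_RANGE[0])
    # Energetic gestures shorten notes (staccato); calm ones lengthen them.
    articulation = ARTICULATION_RANGE[1] - energy * (
        ARTICULATION_RANGE[1] - ARTICULATION_RANGE[0])
    return {"tempo_bpm": tempo, "sound_level_db": level,
            "articulation": articulation}

# A gentle gesture (roughly gravity alone) vs. a vigorous shake:
calm = map_gesture((0.0, 9.8, 0.0))
shake = map_gesture((15.0, 10.0, 8.0))
```

Under this sketch, the vigorous shake yields a faster tempo, a higher sound level, and a lower (more staccato) articulation value than the calm gesture; the paper's evaluation compared such a hand-designed mapping against one derived from recorded gesture data.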