New publication: SoundGuides, Adapting Continuous Auditory Feedback to Users

SoundGuides: Adapting Continuous Auditory Feedback to Users

Jules Françoise, IRCAM, Paris, France
Olivier Chapuis, Univ. Paris-Sud, CNRS & INRIA, Orsay, France
Sylvain Hanneton, Université Paris Descartes, Paris, France
Frédéric Bevilacqua, IRCAM, Paris, France

We introduce SoundGuides, a user-adaptable tool for auditory feedback on movement. The system is based on an interactive machine learning approach, where gestures and sounds are first jointly designed and jointly learned by the system. The system can then automatically adapt the auditory feedback to any new user, taking into account the particular way each user performs a given gesture. SoundGuides is suitable for the design of continuous auditory feedback aimed at guiding users’ movements and helping them to perform a specific movement consistently over time. Applications span from movement-based interaction techniques to auditory-guided rehabilitation. We first describe our system and then report a study that demonstrates a ‘stabilizing effect’ of our adaptive auditory feedback method.
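The announcement does not detail the paper’s algorithm. As a rough illustration of the general idea only (learning a gesture-to-sound mapping from a joint demonstration, then adapting it to a new user’s way of performing the gesture), the Python sketch below uses a hypothetical linear mapping and a simple range rescaling for adaptation; none of the function names or modeling choices come from the paper.

```python
# Illustrative sketch only: NOT the SoundGuides implementation.
# Assumptions: a 1-D gesture feature sampled over time, a single sound
# parameter, a linear mapping, and range rescaling as "user adaptation".

import numpy as np


def learn_mapping(gesture_demo, sound_demo):
    """Fit a linear map from a demonstrated gesture feature to a
    synchronized sound parameter (e.g. a filter cutoff in Hz)."""
    A = np.vstack([gesture_demo, np.ones_like(gesture_demo)]).T
    coeffs, *_ = np.linalg.lstsq(A, sound_demo, rcond=None)
    return coeffs  # slope and intercept


def adapt_to_user(gesture_samples, reference_samples):
    """Rescale a new user's gesture feature so its range matches the
    reference demonstration: a crude stand-in for user adaptation."""
    src_min, src_max = gesture_samples.min(), gesture_samples.max()
    ref_min, ref_max = reference_samples.min(), reference_samples.max()
    scale = (ref_max - ref_min) / max(src_max - src_min, 1e-9)
    return (gesture_samples - src_min) * scale + ref_min


def sound_parameter(coeffs, gesture_value):
    """Continuous auditory feedback: map the (adapted) gesture feature
    to a sound parameter at every frame."""
    slope, intercept = coeffs
    return slope * gesture_value + intercept


if __name__ == "__main__":
    t = np.linspace(0.0, 1.0, 200)
    demo_gesture = np.sin(np.pi * t)            # designer's demonstrated gesture
    demo_sound = 200.0 + 800.0 * demo_gesture   # synchronized sound parameter (Hz)

    coeffs = learn_mapping(demo_gesture, demo_sound)

    # A new user performs the "same" gesture with a smaller amplitude and an offset.
    user_gesture = 0.4 * np.sin(np.pi * t) + 0.1
    adapted = adapt_to_user(user_gesture, demo_gesture)

    feedback = sound_parameter(coeffs, adapted)
    print(f"feedback range: {feedback.min():.1f} Hz to {feedback.max():.1f} Hz")
```

In the actual system the mapping is learned interactively from joint gesture-sound demonstrations and the adaptation accounts for each user’s movement style; the linear fit and min-max rescaling above are placeholders chosen only to keep the sketch self-contained.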

Proceedings
CHI EA ’16: Proceedings of the 2016 CHI Conference Extended Abstracts on Human Factors in Computing Systems
Pages 2829–2836
ACM, New York, NY, USA, 2016
ISBN: 978-1-4503-4082-3
DOI: 10.1145/2851581.2892420

Link to the article page: http://dl.acm.org/citation.cfm?id=2892420&CFID=621448898&CFTOKEN=84867349