Motor adaptation during a sound oriented task (NCM 2017, poster publication)

This poster will be presented in Dublin during the 2017 annual meeting of the Society for the Neural Control of Movement (NCM 2017).

Motor adaptation during a sound oriented task

Eric O. Boyer1,2, Frederic Bevilacqua2, Sylvain Hanneton3, Agnes Roby-Brami1
1: ISIR – CNRS UMR 7222, UPMC; 2: IRCAM – STMS-CNRS, UPMC; 3: LPP – CNRS UMR 8242, University Paris Descartes, Paris, France

Introduction. Movement sonification systems appear promising for sensori-motor learning by providing users with auditory feedback of their own movements [1]. However, research on sonification for sensori-motor learning has mainly addressed “movement-oriented tasks”, where the instruction and the attention are directed toward the movement itself. In contrast, the aim of the present study was to test a situation where the instruction and attention are given to the sound, which we call a “sound-oriented task”.
The sonification mapping relied on the metaphor of friction sounds produced by drawing movements. We focused on the drawing of ellipses, which is characterized by a well-known invariant velocity-curvature relationship [2,3]. The replay of friction sounds (recorded or synthesized) can evoke the shape of the drawings [4] and induce a sensori-motor perceptive bias in the reproduction of visual motion [5].
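For reference, the invariant relationship mentioned here is presumably the classical two-thirds power law [2,3], which for elliptical drawing relates angular velocity to curvature (the notation below is ours, not the poster's):

```latex
A(t) = K \, C(t)^{2/3}
% equivalently, for tangential velocity: v(t) = K \, R(t)^{1/3},
% where R(t) = 1/C(t) is the radius of curvature and K the velocity gain factor
```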
Building on these results, we tested the effect of on-line sonification with friction-like sounds on the kinematics and the shape of elliptical drawing movements. The mapping was implemented as a band-pass filter whose center frequency varied linearly with the velocity of the movement [3,4]. We analyzed the motor adaptation of the drawing movements when subjects were instructed to maintain a constant sonification pattern while the frequency of the filter was changed without their knowledge.
Our hypothesis was that the alteration of the sonification mapping would induce temporal and/or spatial adaptation of the movement.
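As a rough illustration of the mapping described above, the sketch below maps instantaneous hand speed to a band-pass center frequency. All names and numeric values (base frequency, gain) are our own illustrative assumptions, not the parameters used in the study:

```python
import math

# Illustrative sketch of the velocity-to-filter mapping: the band-pass
# center frequency varies linearly with hand speed. F_BASE and GAIN are
# assumed values for illustration only.
F_BASE = 200.0   # Hz, center frequency at zero velocity (assumed)
GAIN = 4000.0    # Hz per (m/s), linear mapping gain (assumed)

def speed(p_prev, p_curr, dt):
    """Finite-difference speed estimate (m/s) from two 2D position samples."""
    return math.hypot(p_curr[0] - p_prev[0], p_curr[1] - p_prev[1]) / dt

def center_frequency(velocity, gain=GAIN):
    """Map instantaneous hand speed (m/s) to the band-pass center frequency (Hz)."""
    return F_BASE + gain * velocity

# A faster stroke yields a higher center frequency (a "brighter" friction
# sound); changing `gain` without the subject's knowledge perturbs the
# sonification pattern that the subject tries to keep constant.
v = speed((0.0, 0.0), (0.01, 0.0), 0.01)     # 1 m/s
print(center_frequency(v))                    # 4200.0 Hz with the assumed values
print(center_frequency(v, gain=GAIN * 1.5))   # increased gain -> higher frequency
```

In the actual experiment this frequency would drive a band-pass filter on the friction-sound source in real time; the sketch only shows the control law, not the audio processing.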

(see the poster for methods and results)

Discussion. The motor adaptation tended to compensate for the changes in sound feedback induced by the changes in the sound-movement mapping. This demonstrates that the participants could adapt their movements to the “sound-oriented” task. The adaptation was manifested by modifications of the kinematics. There was a tendency toward increased frequency and decreased size of the drawing in the control situation. In addition, the movement was faster with larger movements when the gain of the mapping was increased, and slower with smaller movements when it was decreased. The global shape and orientation of the ellipse were not modified in 2D.
This demonstrates that the participants privileged the stability of the geometrical shape and adapted their velocity in order to satisfy the instruction to keep the sonification pattern constant. The increase in velocity was due mostly to a change in frequency, while the decrease was due mostly to a shrinking of the shape, suggesting different movement regimes [e.g. alternating versus discrete, 7]. The modification of the angle of the ellipse during the experiment in 3D but not in 2D was probably due to greater inertial constraints, as shown by Pfann et al. [8]: participants who were instructed to draw circles with shoulder-elbow movements made ellipses of increasing eccentricity as velocity increased (order of magnitude: 1 m/s), and the elongation of the ellipse was in the direction of least inertia. A similar effect was also observed for handwriting-like movements (similar to our 2D task) by Dounskaia et al. [9], but for a much higher velocity regime (instruction level “as fast as possible”, 0.34 m/s), whereas the velocity we used corresponds to their “self-paced” level.

Conclusion. This study demonstrates that movement sonification can be used i) to induce implicit motor adaptation in both planar and 3D movements and ii) to control the direction and magnitude of this adaptation through modification of the mapping parameters.


New publication: Investigating three types of continuous auditory feedback in visuo-manual tracking

Investigating three types of continuous auditory feedback  in visuo‑manual tracking

Éric O. Boyer, Frédéric Bevilacqua, Patrick Susini et Sylvain Hanneton
Exp Brain Res – DOI 10.1007/s00221-016-4827-x
(published online first; print version to follow)

Abstract

The use of continuous auditory feedback for motor control and learning is still understudied and deserves more attention regarding both fundamental mechanisms and applications. This paper presents the results of three experiments studying the contribution of task-, error-, and user-related sonification to visuo-manual tracking and assessing its benefits for sensorimotor learning. The first results show that sonification can help decrease the tracking error, as well as increase the energy of participants’ movements. In the second experiment, when feedback presence was alternated, the user-related sonification did not show feedback-dependency effects, contrary to the error- and task-related feedback. In the third experiment, a 50% reduction of exposure diminished the positive effect of sonification on performance, whereas the increase of average energy with sound remained significant. In a retention test performed on the next day without auditory feedback, movement energy was still higher for the groups previously trained with the feedback. Although performance was not affected by sound, a learning effect was measurable in both sessions, and the user-related group also improved its performance in the retention test. These results confirm that continuous auditory feedback can be beneficial for movement training and also reveal an interesting effect of sonification on movement energy. User-related sonification can prevent feedback dependency and increase retention. Consequently, sonification of the user’s own motion appears to be a promising solution to support movement learning with interactive feedback.

Keywords: Tracking · Auditory feedback · Sensorimotor learning · Sound · Interaction

New publication: Sensori-Motor Learning with Movement Sonification: Perspectives from Recent Interdisciplinary Studies

 Sensori-Motor Learning with Movement Sonification: Perspectives from Recent Interdisciplinary Studies

Access to the full text HERE

This article reports on an interdisciplinary research project on movement sonification for sensori-motor learning. First, we describe the different research fields that have contributed to movement sonification, from music technology, including gesture-controlled sound synthesis and sonic interaction design, to research on sensori-motor learning with auditory feedback. In particular, we propose to distinguish between sound-oriented tasks and movement-oriented tasks in experiments involving interactive sound feedback. We describe several research questions and recently published results on movement control, learning, and perception. In particular, we studied the effect of auditory feedback on movements in several cases: from experiments on pointing and visuo-motor tracking to more complex tasks where interactive sound feedback can guide movements, and cases of sensory substitution where the auditory feedback can inform about object shapes. We also developed specific methodologies and technologies for designing sonic feedback and movement sonification. We conclude with a discussion of key future research challenges in sensori-motor learning with movement sonification. We also point toward promising applications such as rehabilitation, sport training, and product design.

Keywords: sonification, movement, learning, sensori-motor, sound design, interactive systems

Citation: Bevilacqua F, Boyer EO, Françoise J, Houix O, Susini P, Roby-Brami A and Hanneton S (2016) Sensori-Motor Learning with Movement Sonification: Perspectives from Recent Interdisciplinary Studies. Front. Neurosci. 10:385. doi: 10.3389/fnins.2016.00385

New publication: SoundGuides, Adapting Continuous Auditory Feedback to Users

SoundGuides: Adapting Continuous Auditory Feedback to Users

Jules Françoise – IRCAM, Paris, France
Olivier Chapuis – Univ. Paris-Sud, CNRS & INRIA, Orsay, France
Sylvain Hanneton – Université Paris-Descartes, Paris, France
Frédéric Bevilacqua – IRCAM, Paris, France

We introduce SoundGuides, a user-adaptable tool for auditory feedback on movement. The system is based on an interactive machine learning approach, where gestures and sounds are first jointly designed and jointly learned by the system. The system can then automatically adapt the auditory feedback to any new user, taking into account the particular way each user performs a given gesture. SoundGuides is suitable for the design of continuous auditory feedback aimed at guiding users’ movements and helping them perform a specific movement consistently over time. Applications span from movement-based interaction techniques to auditory-guided rehabilitation. We first describe our system and then report a study that demonstrates a ‘stabilizing effect’ of our adaptive auditory feedback method.

Proceedings
CHI EA ’16: Proceedings of the 2016 CHI Conference Extended Abstracts on Human Factors in Computing Systems
Pages 2829-2836
ACM, New York, NY, USA © 2016
ISBN: 978-1-4503-4082-3 – doi: 10.1145/2851581.2892420

Link to the article page: http://dl.acm.org/citation.cfm?id=2892420

New publication: From ear to hand: the role of the auditory-motor loop in pointing to an auditory source

Boyer EO, Babayan BM, Bevilacqua F, Noisternig M, Warusfel O, Roby-Brami A, Hanneton S and Viaud-Delmon I (2013) From ear to hand: the role of the auditory-motor loop in pointing to an auditory source. Front. Comput. Neurosci. 7:26. doi: 10.3389/fncom.2013.00026

PDF: http://www.frontiersin.org/Journal/DownloadFile.ashx?pdf=1&FileId=114799&articleId=43100&Version=1&ContentTypeId=21&FileName=fncom-07-00026.pdf

Open-access link to the article: http://www.frontiersin.org/Journal/Abstract.aspx?ART_DOI=10.3389/fncom.2013.00026&name=computational_neuroscience

Studies of the neural mechanisms involved in goal-directed movements tend to concentrate on the role of vision. We present here an attempt to address the mechanisms whereby an auditory input is transformed into a motor command. The spatial and temporal organization of hand movements was studied in normal human subjects as they pointed toward unseen auditory targets located in a horizontal plane in front of them. Positions and movements of the hand were measured with a six-camera infrared tracking system. In one condition, we assessed the role of auditory information about target position in correcting the trajectory of the hand; to accomplish this, the duration of the target presentation was varied. In another condition, subjects received continuous auditory feedback of their hand movement while pointing to the auditory targets. Online auditory control of the direction of pointing movements was assessed by evaluating how subjects reacted to shifts in the heard hand position. Localization errors were exacerbated by short durations of target presentation but not modified by auditory feedback of hand position. Long durations of target presentation gave rise to a higher level of accuracy and were accompanied by early automatic head-orienting movements consistently related to target direction. These results highlight the efficiency of auditory feedback processing in online motor control and suggest that the auditory system takes advantage of the dynamic changes in acoustic cues produced by changes in head orientation. How to design informative acoustic feedback needs to be studied carefully in order to demonstrate that auditory feedback of the hand could assist the monitoring of movements directed at objects in auditory space.