Gestural control of wavefield synthesis

Publication Type:

Conference Paper


Proceedings of the Sound and Music Computing Conference 2016, SMC 2016, Hamburg, Germany (2016)





We present a report on our preliminary research on the control of spatial sound sources in wavefield synthesis through gesture-based interfaces. After a short general introduction to spatial sound and a few basic concepts of wavefield synthesis, we present a graphical application called spAAce, which lets users control the real-time movement of sound sources by drawing trajectories on a screen. The first prototype of this application was developed on top of WFSCollider, an open-source software package based on SuperCollider that lets users control wavefield synthesis. The spAAce application was implemented in Processing, a programming language for sketches and prototypes in the context of the visual arts, and communicates with WFSCollider through the Open Sound Control protocol. This application aims to create a new mode of interaction for live performance of spatial composition and live electronics. In a subsequent section we present an auditory game in which players can walk freely inside a virtual acoustic environment (a room in a commercial ship) while being exposed to several “enemies”, which the player needs to localise and eliminate by using a Nintendo WiiMote game controller to “throw” sounding objects towards them. The aim of this project was to create a gestural interface for a game based on auditory cues only, and to investigate how convolution reverberation affects people’s perception of distance in a wavefield synthesis setup.
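As an illustration of the Open Sound Control link mentioned above, the sketch below (in Python rather than Processing, using only the standard library) hand-encodes an OSC message carrying 2-D source coordinates and sends it over UDP. The address pattern `/WFS/source/pos` is a hypothetical example, not the actual WFSCollider namespace; port 57120 is SuperCollider's default language port.

```python
import socket
import struct

def osc_string(s: str) -> bytes:
    """OSC strings are null-terminated and padded to a multiple of 4 bytes."""
    b = s.encode("ascii") + b"\x00"
    return b + b"\x00" * (-len(b) % 4)

def osc_message(address: str, *args: float) -> bytes:
    """Encode an OSC message whose arguments are all float32 (big-endian)."""
    type_tags = "," + "f" * len(args)          # e.g. ",ff" for two floats
    payload = b"".join(struct.pack(">f", a) for a in args)
    return osc_string(address) + osc_string(type_tags) + payload

def send_source_position(x: float, y: float,
                         host: str = "127.0.0.1", port: int = 57120) -> bytes:
    # /WFS/source/pos is an assumed address for illustration only.
    msg = osc_message("/WFS/source/pos", x, y)
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.sendto(msg, (host, port))
    sock.close()
    return msg
```

In a trajectory-drawing interface of this kind, each point the user drags across the screen would typically be mapped to scene coordinates and sent as one such datagram.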

SMC paper: SMC2016_submission_29.pdf (890.94 KB)