
Gestural Control of Wavefield Synthesis

Abstract

We present a report covering our preliminary research on the control of spatial sound sources in wavefield synthesis through gesture-based interfaces. After a short general introduction to spatial sound and a few basic concepts of wavefield synthesis, we present a graphical application called spAAce, which lets users control real-time movements of sound sources by drawing trajectories on a screen. The first prototype of this application was developed bound to WFSCollider, an open-source software package based on SuperCollider that lets users control wavefield synthesis. The spAAce application has been implemented using Processing, a programming language for sketches and prototypes within the context of visual arts, and communicates with WFSCollider through the Open Sound Control protocol. This application aims to create a new way of interaction for live performance of spatial composition and live electronics.

In a subsequent section we present an auditory game in which players can walk freely inside a virtual acoustic environment (a room on a commercial ship) while being exposed to the presence of several "enemies", which the player needs to localise and eliminate by using a Nintendo WiiMote game controller to "throw" sounding objects towards them. The aim of this project was to create a gestural interface for a game based on auditory cues only, and to investigate how convolution reverberation affects people's perception of distance in a wavefield synthesis setup.
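To illustrate the kind of communication the abstract describes, here is a minimal sketch of how a drawing application like spAAce might send a source-position update over the Open Sound Control protocol. The OSC address `/WFS/source/pos`, the argument layout, and the target port are hypothetical placeholders for illustration, not the actual WFSCollider API (the example only encodes a standard OSC message and sends it over UDP):

```python
import socket
import struct

def osc_string(s: str) -> bytes:
    """Null-terminate a string and pad it to a 4-byte boundary, per the OSC spec."""
    b = s.encode("ascii") + b"\x00"
    return b + b"\x00" * (-len(b) % 4)

def osc_message(address: str, *args: float) -> bytes:
    """Encode an OSC message whose arguments are all 32-bit floats."""
    msg = osc_string(address)                      # address pattern
    msg += osc_string("," + "f" * len(args))       # type tag string, e.g. ",fff"
    for a in args:
        msg += struct.pack(">f", a)                # big-endian float32 arguments
    return msg

# Hypothetical message: source index 0 moved to (x, y) = (1.5, -0.25) metres.
packet = osc_message("/WFS/source/pos", 0.0, 1.5, -0.25)

# Send over UDP; 57120 is SuperCollider's default language port, assumed here.
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.sendto(packet, ("127.0.0.1", 57120))
```

In practice a Processing sketch would use an OSC library rather than encoding packets by hand, but the wire format above is what travels between the gesture interface and the synthesis engine.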

This paper was published in VBN.
