The sound is captured by the four-microphone array embedded in a Microsoft Kinect v2 sensor, and the data is sent to a patch in Cycling '74 Max 7 using the dp.kinect2 plugin created by Dale Phurrough. The colour of each sound wave is directly correlated with the sound itself. The amplitude of the captured sound controls the brightness of the colour; input amplitudes range from 0.000011247 Pa to 0.000112468 Pa (−5 dB to 15 dB SPL). The distance of the performer from the camera controls the saturation of the colour; the raw signal values taken as input range from roughly −1000 to 1000. The frequency of the note played, i.e. the MIDI key pressed, controls the hue: we took 121 hues between 380 nm and 750 nm to represent the notes from A-2 to A9, the frequencies from 13.75 Hz to 14080 Hz, and the MIDI keys from 9 to 129. The sound waves are graphed with jit.gl.graph, and the wave graphs travel away from the performers toward the audience. One member of the audience may wear an Oculus Rift DK2 HMD to experience the scene in virtual reality; immersed in this environment, the user gains a sense of presence.
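As a rough illustration of the colour mapping described above, the following Python sketch reproduces the three relationships (amplitude to brightness, distance to saturation, MIDI key to hue). It is a minimal approximation under stated assumptions, not the actual Max patch: the linear scaling, the clamping, and the wavelength-to-HSV-hue conversion are all assumptions, since only the input ranges are documented above.

import colorsys

# Reference sound pressure for dB SPL (20 micropascals).
P_REF = 2e-5

# Input ranges described above. The mappings below are assumed to be
# linear; the actual Max patch scaling is not published.
AMP_MIN = P_REF * 10 ** (-5 / 20)   # -5 dB SPL -> 0.000011247 Pa
AMP_MAX = P_REF * 10 ** (15 / 20)   # 15 dB SPL -> 0.000112468 Pa
DIST_MIN, DIST_MAX = -1000.0, 1000.0  # raw depth-signal range
MIDI_MIN, MIDI_MAX = 9, 129           # A-2 .. A9, 121 keys
WL_MIN, WL_MAX = 380.0, 750.0         # visible-spectrum wavelengths, nm


def clamp01(x: float) -> float:
    return max(0.0, min(1.0, x))


def midi_to_frequency(key: int) -> float:
    """Equal-tempered frequency of a MIDI key (A4 = key 69 = 440 Hz)."""
    return 440.0 * 2 ** ((key - 69) / 12)


def sound_to_rgb(pressure_pa: float, distance: float, midi_key: int):
    """Map one captured sound event to an RGB colour.

    brightness <- amplitude, saturation <- performer distance,
    hue <- MIDI key (via one of 121 wavelengths between 380 and 750 nm).
    """
    value = clamp01((pressure_pa - AMP_MIN) / (AMP_MAX - AMP_MIN))
    saturation = clamp01((distance - DIST_MIN) / (DIST_MAX - DIST_MIN))
    # One of the 121 wavelengths, evenly spaced across the spectrum.
    wavelength = WL_MIN + (midi_key - MIDI_MIN) / (MIDI_MAX - MIDI_MIN) \
        * (WL_MAX - WL_MIN)
    # Fold the wavelength onto the HSV hue circle: red (750 nm) at 0.0,
    # violet (380 nm) near 0.8 -- a common approximation, not exact optics.
    hue = 0.8 * (WL_MAX - wavelength) / (WL_MAX - WL_MIN)
    return colorsys.hsv_to_rgb(hue, saturation, value)


# Example: A4 (440 Hz, MIDI key 69) at about 10 dB SPL, performer mid-range.
print(midi_to_frequency(69))          # 440.0
print(sound_to_rgb(6.3e-5, 0.0, 69))  # a green, half-saturated colour

Checking the endpoints confirms the figures quoted above: key 9 gives 440 × 2⁻⁵ = 13.75 Hz and key 129 gives 440 × 2⁵ = 14080 Hz, while −5 dB and 15 dB SPL give 0.000011247 Pa and 0.000112468 Pa respectively.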
  The project is called Synesthesia. Synesthesia is defined as “a sensation produced in one modality when a stimulus is applied to another modality”. The project visualizes the sound waves produced in the room in which it is performed: the amplitude of a sound changes the brightness of its colour, and the frequency changes the hue. Our motivation behind this project was that we wanted to experience what it would be like to have a limited form of synesthesia. We use a remix of the song “Let It Go”, mixed by Bearson and originally written by James Bay.
The project has since been updated for the Oculus Rift.
