Media & Entertainment – AR/VR
“Is the mind separated from the body? Is there any relation between our thoughts and our physiology?”
During this act we address this ancient philosophical question by using state-of-the-art technology to "hack" the brain and the body and translate their activity into a creative set of sounds and images, all related to our deepest feelings, emotions and intentions. Here the brain acts as an audiovisual sampler and the body movements as an effect modulator: the mind controls audio and video samples organized in a mental playlist according to its activation or relaxation, while the body controls the effects according to its position, acceleration and angle.
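The core mapping above can be sketched in a few lines: a normalized mental-state value (relaxation at one end, activation at the other) selects a slot in the "mental playlist". This is a minimal illustrative sketch, not the project's actual code; the playlist names, the 0-to-1 scale and the function name are assumptions.

```python
# Illustrative sketch: route a normalized mental-state value
# (0.0 = deep relaxation, 1.0 = high activation) to one slot of a
# hypothetical "mental playlist". Names and thresholds are invented.

PLAYLIST = ["calm_waves", "ambient_drone", "pulse_beat", "noise_burst"]

def select_sample(activation: float) -> str:
    """Pick a playlist entry from a 0..1 activation level."""
    activation = min(max(activation, 0.0), 1.0)  # clamp to the valid range
    # Map the continuous level onto a discrete playlist index.
    index = min(int(activation * len(PLAYLIST)), len(PLAYLIST) - 1)
    return PLAYLIST[index]

print(select_sample(0.1))   # low activation -> most relaxed clip
print(select_sample(0.95))  # high activation -> most active clip
```

In the actual piece this routing is performed inside the Max/MSP "concentration/meditation algorithm"; the sketch only shows the shape of the mapping.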
Technically, we used an Emotiv EPOC EEG (electroencephalography) headset and a prototype application named #FindingSomethingBondingSound (developed by our team) to acquire and translate the brain data, to trigger the video clips hosted in a Max/MSP Jitter patch and the sounds hosted in the Ableton Live music production software, and to integrate and process the data coming from two R-IoT sensors (gesture sensors developed by IRCAM, Centre Pompidou) that manipulate the sound and video effects in Ableton Live. The sounds and the projected video clips were controlled by the mind according to a Max/MSP "concentration/meditation algorithm" (developed by our team). The sound and video effects (layers, velocity, pixels, motion graphics, pitch, pan) were MIDI-mapped and controlled by the R-IoT sensors. The footage and sounds were all original and recorded by our team. We used a single 6000-lumen projector and a standard stereo PA.
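Because the effect parameters are MIDI-mapped, each R-IoT reading must end up as a 7-bit controller value (0-127). The following sketch shows one plausible way to scale a tilt angle into that range; the function name, the angle bounds and the linear scaling are assumptions for illustration, not the actual #FindingSomethingBondingSound code.

```python
# Illustrative sketch: scale a sensor tilt angle (degrees) into the
# 0-127 range of a MIDI control-change value, as used when effect
# parameters (pitch, pan, layers...) are MIDI mapped. All names and
# ranges here are hypothetical.

def angle_to_cc(angle_deg: float, lo: float = -90.0, hi: float = 90.0) -> int:
    """Linearly scale an angle in [lo, hi] degrees to a MIDI CC value."""
    angle_deg = min(max(angle_deg, lo), hi)  # clamp to the sensor's range
    return round((angle_deg - lo) / (hi - lo) * 127)

print(angle_to_cc(-90.0))  # full tilt one way -> 0
print(angle_to_cc(90.0))   # full tilt the other way -> 127
```

Position and acceleration would be handled the same way: clamp the raw reading to a known range, then scale it linearly onto 0-127 before sending it as a control-change message to the mapped effect.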