On September 17th, the ANR-funded BBDMI project opened the doors of its laboratory at the MSH Paris Nord, as part of the European Heritage Days. We were able to showcase our latest developments, and we did so with an ambitious setup: in sessions of about a dozen people each, we allowed guests to measure their electromyography (EMG) and electroencephalography (EEG) and to explore real-time multi-agent hybrid sound synthesis. For EMG we provided both the Myo armband (since bought by Facebook and out of production) and our own open-source EAVI board, developed by Atau Tanaka at the Embodied AudioVisual Interaction (EAVI) research group at Goldsmiths. To record brain activity, we provided the research-grade wireless Mentalab Explore EEG system, which we wrote about earlier. The Mentalab Explore was kindly provided by Mentalab, who support us in the development of an open-source EEG-controlled musical instrument.
EEG was transmitted over LSL using Mentalab’s latest GUI. The EEGsynth then calculated user-specific alpha-band brain activity in real time and used OSC to stream it as modulation signals to our sound synthesis platform in Max/MSP. The EMG data from the EAVI board was streamed as MIDI, while a Myo module was used to extract the data from the Myo armbands.
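For readers curious about what such a pipeline looks like in code, here is a minimal standalone sketch of the same idea: reading EEG from an LSL stream, estimating alpha-band (8–12 Hz) power, and sending it out over OSC. This is not the actual EEGsynth implementation, which is organised as separate modules communicating via Redis and configured with ini files; the OSC address, port, and analysis parameters below are illustrative assumptions.

```python
# Minimal sketch (not the actual EEGsynth pipeline): LSL in -> alpha power -> OSC out.
# Assumes an EEG stream is available on LSL and an OSC receiver listens on port 9000.
import numpy as np
from pylsl import StreamInlet, resolve_byprop
from scipy.signal import welch
from pythonosc.udp_client import SimpleUDPClient

osc = SimpleUDPClient("127.0.0.1", 9000)           # e.g. a [udpreceive 9000] in Max/MSP
streams = resolve_byprop("type", "EEG", timeout=10)
inlet = StreamInlet(streams[0])
fs = inlet.info().nominal_srate()

window = int(2 * fs)                               # 2-second analysis window
buffer = np.zeros((window, inlet.info().channel_count()))

while True:
    chunk, _ = inlet.pull_chunk(timeout=1.0)
    if not chunk:
        continue
    buffer = np.vstack([buffer, np.asarray(chunk)])[-window:]  # keep the most recent window

    # Average power in the 8-12 Hz alpha band over all channels
    freqs, psd = welch(buffer, fs=fs, nperseg=window, axis=0)
    alpha = psd[(freqs >= 8) & (freqs <= 12)].mean()

    osc.send_message("/bbdmi/alpha", float(alpha))  # hypothetical OSC address
```

In the real setup this value would be scaled to each participant's own alpha range before being mapped onto synthesis parameters; the sketch leaves that calibration step out for brevity.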
These technical details were of no concern to the audience, who were presented with a didactic and experiential listening experience, in which we explored body and brain music mapping through different styles of sound synthesis. The event started with an introduction by the director of the MSH Paris Nord, coordinator of the BBDMI project, who set out the larger cultural and musical context of the project. Atau Tanaka then explained the different biophysical measures that we would be using, after which David Fierro, Francesco Dimaggio, and I connected the volunteers. I used the Mentalab Ag/AgCl ring electrodes with electrode gel. Ag/AgCl electrodes are well known to give the best data quality, far superior to any dry-electrode setup in terms of stability and signal-to-noise ratio. Setup is also very quick if you know what you are doing, and takes me no more than a couple of minutes. People often forget that setting up a dry-electrode system requires a lot more fiddling around: it often necessitates preparing the skin, wiggling electrodes through the hair, etc. Even with one participant with a lot of hair, I just used some more gel and got a great signal. For the EAVI board we used disposable gel electrodes, which also ensure a stable signal with a high signal-to-noise ratio.
When everyone was set up, David guided the volunteers and audience through several exercises of sound synthesis and its modulation by EEG and EMG. We started with raw sonification of the bioelectric signals, listening to the raw EMG signal distributed sonically through space in the 3D speaker setup. We then explored timbre, pitch, geometric and ambisonic diffusion, granulation, and even “mixed music”, in which volunteers played percussive instruments that were modulated by the brain-body ensemble. We ended with a collective “concert” of all the volunteers, using machine learning to map the audience’s bodies and brains into a collective embodied soundscape. The event was very well received, and in the Q&A sessions we could together explore all the different topics and reflections that it raised.
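As an aside for the technically inclined: the simplest of these exercises, raw sonification, amounts to little more than treating the bioelectric signal itself as audio. The sketch below illustrates the principle offline in Python, by resampling a recorded EMG trace to audio rate and writing it to a wav file. During the event this happened in real time in Max/MSP over the 3D speaker array; the file names and sampling rates here are illustrative assumptions.

```python
# Minimal offline sketch of raw sonification (audification) of an EMG recording.
# The live setup did this in real time in Max/MSP; this only illustrates the principle.
import numpy as np
from scipy.signal import resample
from scipy.io import wavfile

emg_rate = 1000                      # assumed EMG sampling rate in Hz
audio_rate = 44100                   # standard audio rate

emg = np.load("emg_trace.npy")       # hypothetical 1-D EMG recording
emg = emg - emg.mean()               # remove the DC offset

# Resample the trace to audio rate; the duration is preserved
audio = resample(emg, int(len(emg) * audio_rate / emg_rate))

# Normalise to 16-bit range and write out
audio = np.int16(audio / np.max(np.abs(audio)) * 32767)
wavfile.write("emg_sonification.wav", audio_rate, audio)
```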
We were also very happy with how well the event went technically: the setup was robust and flexible, and allowed us to demonstrate many aspects of brain-body music. The data quality was as good as it gets. It was satisfying that, after we introduced alpha activity, the audience exclaimed a collective “Wow!” when large-amplitude alpha activity appeared (see below!). It was also the first time that we could demonstrate the user-friendly and flexible patching system in Max/MSP that Francesco has been developing, and which we will be releasing soon. In fact, our repository, with all the patches, objects, modules and documentation, will be released before the end of the year! So stay tuned!