April 7-9, 2017
For two nights we stayed at the ASTRON radio telescope facility in Dwingeloo, The Netherlands, to develop the protocols for the COGITO performance envisioned by artist Daniela de Paulis. The primary goal of this meeting was to develop a paradigm in which high-quality electroencephalography (EEG) is recorded while the participant is embedded in a virtual reality, and to stream the EEG into space using the 25-meter dish at the radio astronomy facility. The artistic objective of the EEG recording and transmission was to capture the subjective experience of the person viewing the immersive video evoking the Overview Effect, and to communicate the intricate subjectivity of the human mind to a potential extraterrestrial listener. The team consisted of neuroscientists, artists, applied physicists, radio amateurs, and radio astronomers. Our initial challenge was to identify the different parts of the project and how they would all connect in a real-time, performative situation, during which the EEG is recorded, processed, and sent out into space at (almost) the same time.
Development of the interstellar EEG radio protocol
Our first challenge was to understand and develop a way in which we could transmit the EEG through a radio telescope. One observation was that we are limited to transmitting at a single specific frequency band, with a limited bandwidth of 2400Hz. More interesting was the challenge of how to encode the EEG into a single frequency band, instead of using multiple FM transmissions at different frequencies, one for each electrode. Only one protocol for interstellar radio-transmission of EEG had been developed before, for the same art project in 2014. That protocol was developed by the students of Prof. Sennay Ghebreab at the University College Amsterdam, using a 14-channel Emotiv headset. The focus of this first protocol was mainly the conversion of the EEG signal into pre-recorded sound that could be conveniently transmitted into space for a performance at the 50th Biennale in Ljubljana. The first documented time that EEG was purposefully sent into space was in 1977, when it was etched onto a golden record and bolted onto a space probe. Pretty good signal-to-noise, but not very convenient, or fast. Just like the radio amateurs, we would need to work with (continuous, real-time) single-channel sound waves that are then transmitted in the megahertz range. Additionally, we wanted to convey something of the spatial topography of the EEG, i.e. where the electrodes were located with respect to each other, allowing someone ‘at the other side’ to reconstruct an approximation of the same information that terrestrial neuroscientists would use to interpret EEG.
To contain the EEG within a single audio channel we developed the following protocol. First we transformed all the channels into the frequency domain. Since most EEG signals from the brain are contained in frequencies under 45Hz, we only retained 1-45Hz. We then concatenated all channels in the frequency domain, encoding each channel into a 75Hz bandwidth (note: 75*32 = 2400). Each 75Hz part starts with a pure tone at 1Hz relative to the start of the band, followed by the 1-45Hz spectrum of the EEG channel, after which the x, y and z positions of the electrode are encoded as peaks in three successive 10Hz bands. Thus, channel 1 retained its original spectrum at 1-45Hz, while channel 2 went to 76-121Hz, channel 3 went to 151-196Hz, etc. Once all the encoding and concatenation was done, we inverted the frequency representation back to a single audio channel with a sample rate of 44.1kHz. The information about the channel locations would allow one to decode, if not the exact locations, at least the 3-dimensional order of the electrodes with respect to each other. The orientation of the electrodes would not be possible to decode, but we accepted ambiguity about left-vs-right, top-vs-down, and front-vs-back.
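A rough sketch of this encoding in Python follows. The function and parameter names are our own, and for brevity only the magnitude spectrum of each channel is kept (dropping phase); the real implementation may differ.

```python
import numpy as np

FS_EEG = 250            # EEG sampling rate (Hz)
FS_AUDIO = 44100        # audio sampling rate (Hz)
N_CHANNELS = 32
BAND = 75               # bandwidth allotted per channel (Hz); 32 * 75 = 2400
EEG_LO, EEG_HI = 1, 45  # retained EEG band (Hz)

def encode_block(eeg, positions, duration=1.0):
    """Encode one block of EEG (channels x samples) plus electrode
    positions (channels x 3, scaled to 0..1) into one mono audio block."""
    n_audio = int(FS_AUDIO * duration)
    freqs = np.fft.rfftfreq(n_audio, 1.0 / FS_AUDIO)
    spectrum = np.zeros(freqs.size)

    ch_freqs = np.fft.rfftfreq(eeg.shape[1], 1.0 / FS_EEG)
    keep = (ch_freqs >= EEG_LO) & (ch_freqs <= EEG_HI)

    for ch in range(N_CHANNELS):
        base = ch * BAND
        mag = np.abs(np.fft.rfft(eeg[ch]))[keep]  # magnitude only, for brevity
        # the 1-45Hz EEG spectrum, shifted up to base+1 .. base+45 Hz
        dest = (freqs >= base + EEG_LO) & (freqs <= base + EEG_HI)
        spectrum[dest] = np.interp(freqs[dest] - base, ch_freqs[keep], mag)
        # marker tone at 1 Hz relative to the start of the band
        spectrum[np.argmin(np.abs(freqs - (base + 1)))] = mag.max()
        # x, y, z positions as one peak in each of three 10 Hz sub-bands
        for axis in range(3):
            f = base + EEG_HI + axis * 10 + positions[ch, axis] * 9
            spectrum[np.argmin(np.abs(freqs - f))] = mag.max()

    audio = np.fft.irfft(spectrum, n=n_audio)   # back to the time domain
    return audio / np.abs(audio).max()          # normalize for audio output
```

Note that the last channel's band ends just below 2400Hz (31*75 + 45 + 30 = 2400), so the whole code fits within the available transmission bandwidth.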
It was then important to also make sure that transmitting the audio file into the megahertz range, and into space, would not distort the data, given the many sources of potential interference and noise. We were able to simulate these distortions, and found the data to be very well retained. Of course the best test would be to send the audio files between radio telescopes and reconstruct the data based on the received sound wave.
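The flavor of such a simulation can be illustrated as follows. This stand-in uses plain white noise and a random in-band spectrum rather than the actual EEG encoding or a realistic channel model; it only shows that a band-limited spectrum survives additive noise well.

```python
import numpy as np

rng = np.random.default_rng(0)
fs, n = 44100, 44100  # one second of audio

# a surrogate "encoded" signal: a random magnitude spectrum limited to
# 0-2400 Hz, standing in for the concatenated EEG spectra of the protocol
freqs = np.fft.rfftfreq(n, 1.0 / fs)
spectrum = np.where(freqs <= 2400, rng.random(freqs.size), 0.0)
clean = np.fft.irfft(spectrum, n=n)

# simulate additive receiver noise at roughly 20 dB SNR
received = clean + rng.standard_normal(n) * clean.std() * 0.1

# "decode" by going back to the frequency domain, compare within the band
recovered = np.abs(np.fft.rfft(received))
band = freqs <= 2400
r = np.corrcoef(spectrum[band], recovered[band])[0, 1]
# r stays close to 1: the in-band spectrum is nearly unchanged by the noise
```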
Developing the real-time transmission
Providing the radio telescope with real-time data, i.e. streaming data into space, requires us to provide it with an audio stream. It is, after all, an analogue system; the radio amateurs know what to do from there, employing their tricks of the trade to ensure the optimal quality of transmission. The trick was to develop everything that comes before: from EEG to the encoded audio. Not coincidentally, we already had quite a system in place for real-time analysis of EEG, using the FieldTrip buffer, which also forms the basis for the real-time EEG-to-sound architecture of the EEGsynth. The plan is to stream the EEG data from the GTec acquisition software to a FieldTrip buffer. This stream represents 32 channels at 250Hz. A to-be-developed EEGsynth software module reads this data, performs the audio encoding, and places the result into another FieldTrip buffer. That stream represents a single channel at 44100Hz, comparable to an audio signal. A third EEGsynth software module reads that data and writes it to the audio card. The analogue output of the audio card goes to the transmitter.
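The data flow can be sketched with plain numpy stand-ins for the two FieldTrip buffers. The real modules would use the FieldTrip buffer's network client, and the simple upsampling below is only a placeholder for the actual spectral encoding; the point is the block-wise, two-buffer architecture and the change of rate and channel count.

```python
import numpy as np

FS_EEG, FS_AUDIO, N_CH = 250, 44100, 32

eeg_buffer = []    # stand-in for FieldTrip buffer 1 (32 channels @ 250 Hz)
audio_buffer = []  # stand-in for FieldTrip buffer 2 (1 channel @ 44100 Hz)

def acquisition_step(rng):
    """The acquisition software appends one second of 32-channel EEG."""
    eeg_buffer.append(rng.standard_normal((N_CH, FS_EEG)))

def encoding_step():
    """The encoder module reads EEG blocks and writes encoded audio blocks."""
    while eeg_buffer:
        eeg = eeg_buffer.pop(0)
        # placeholder for the real 32-to-1 spectral encoding: here we just
        # upsample the channel mean to the audio rate to keep the sketch short
        audio = np.interp(np.arange(FS_AUDIO) / FS_AUDIO,
                          np.arange(FS_EEG) / FS_EEG, eeg.mean(axis=0))
        audio_buffer.append(audio.astype(np.float32))

rng = np.random.default_rng(0)
acquisition_step(rng)
encoding_step()
print(audio_buffer[0].shape)  # one second of mono audio: (44100,)
```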
The FieldTrip buffer for the 32-channel EEG data was put into place and we were able to stream the incoming EEG without problems. We were able to perform the conversion to audio in an off-line scenario and will next implement it in the EEGsynth architecture. The conversion from an audio file to an audio stream through a sound card turned out to be a bit of a programming hassle, but nothing that we will not be able to deal with soon.
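One way to feed the encoded stream to the sound card is a pull-style callback over a queue of pending blocks. The sketch below targets the sounddevice package (an assumption on our part; any PortAudio binding with a callback interface would do), and the queueing logic is the part that matters.

```python
from collections import deque
import numpy as np

FS_AUDIO = 44100
pending = deque()  # blocks of float32 samples waiting to be played

def audio_callback(outdata, frames, time, status):
    """Fill the sound card's buffer from pending blocks; zero-pad on underrun."""
    out = np.zeros(frames, dtype=np.float32)
    filled = 0
    while filled < frames and pending:
        block = pending[0]
        take = min(frames - filled, block.size)
        out[filled:filled + take] = block[:take]
        filled += take
        if take == block.size:
            pending.popleft()        # block fully consumed
        else:
            pending[0] = block[take:]  # keep the unplayed remainder
    outdata[:, 0] = out

# In the real module the callback would be attached roughly like this:
#   import sounddevice as sd
#   stream = sd.OutputStream(samplerate=FS_AUDIO, channels=1,
#                            callback=audio_callback)
#   stream.start()
```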
EEG recording during virtual reality
The moving images of the VR are still under development, but already awe-inspiring, intense and arresting. And most of all, absolutely gorgeous. But we needed to see whether we could combine the VR with a 32-channel EEG recording. The Oculus Rift is not very heavy and is comfortably worn using two lateral straps and one on top of the head. Although the straps cover quite some space on the EEG cap, we were able to tightly but comfortably re-position the 32 electrodes in two strips on the left and right sides of the head, as well as over occipital regions at the back of the head. Luckily the VR set did not create much – if any – interference on the EEG measurements. Gentle head movements also did not create significant artifacts, thanks to the robust attachment of the EEG cap and VR headset. This is great, because although no actions are possible in the VR environment, its beauty just needs to be explored, at least by looking around.
The next steps
Having everyone together, working with the EEG, VR and radio telescope all at once, was very useful in developing the paradigm from the artistic to the technical and practical. We now need to work out some remaining technical/programming issues separately, but we are looking forward to converging again soon and transmitting our first EEG into space.
Attendees
- Daniela de Paulis – University of Amsterdam, Amsterdam, NL
- Sandro Bocci
- Robert Oostenveld – Donders Institute, Radboud University, Nijmegen, NL
- Guillaume Dumas – Institut Pasteur, Paris, FR
- Stephen Whitmarsh – École normale supérieure, Paris, FR
- Michel Arts – Netherlands Institute for Radio Astronomy, Dwingeloo, NL
- Michael Sanders – C.A. Muller Radioastronomie Station (CAMRAS)
- Jan van Muijlwijk – C.A. Muller Radioastronomie Station (CAMRAS)
The code for the project will be shared online; two of the resources will be https://github.com/eegsynth/ and https://github.com/fieldtrip/fieldtrip.