Information
Universal Assembly Unit has now disassembled (2013-2020). Special thanks to our collaborators and commissioners who have come with us on our journey over the years. Feel free to get in touch about past projects.
For a concert performed by the London Contemporary Orchestra at the Barbican in October 2018, we created a unique audio-visual installation using a newly developed generative adversarial network (GAN) that produced live visuals in response to the music in real time. The orchestra were encased behind a translucent mesh, acting as a luminous and permeable membrane between the world of the audience and that of the players.
We selected imagery and video to train the network offline, so it would learn the aesthetics and movement of the natural world, from thunderstorms and underwater kelp forests to volcanoes and prehistoric caves. Responding in real time to the live performance, the network could then generate, or ‘hallucinate’, original imagery, creating an artificial synaesthetic experience.
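The installation is described here only at a high level, but the core idea, live audio features steering a pretrained generator's latent space frame by frame, can be sketched. The Python below is a hypothetical illustration rather than the installation's actual pipeline: the TinyGenerator class, the FFT-band mapping, and the smoothing constant are placeholder assumptions standing in for the studio's trained GAN and audio setup.

```python
# Minimal sketch (not the installation's actual code): mapping live audio
# windows to a GAN latent vector so each audio frame drives one video frame.
import numpy as np
import torch
import torch.nn as nn

LATENT_DIM = 128       # assumed latent size
FRAME_SAMPLES = 2048   # audio samples per analysis window
SMOOTHING = 0.85       # temporal smoothing so visuals evolve rather than flicker


class TinyGenerator(nn.Module):
    """Stand-in for a pretrained GAN generator producing 64x64 RGB frames."""

    def __init__(self, latent_dim: int = LATENT_DIM):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(latent_dim, 256),
            nn.ReLU(),
            nn.Linear(256, 64 * 64 * 3),
            nn.Tanh(),
        )

    def forward(self, z: torch.Tensor) -> torch.Tensor:
        return self.net(z).view(-1, 3, 64, 64)


def audio_to_latent(audio_frame: np.ndarray, latent_dim: int = LATENT_DIM) -> torch.Tensor:
    """Project an audio window's magnitude spectrum into the latent space."""
    spectrum = np.abs(np.fft.rfft(audio_frame))
    # Bucket the spectrum into latent_dim bands and roughly normalise them.
    bands = np.array_split(spectrum, latent_dim)
    energies = np.array([band.mean() for band in bands], dtype=np.float32)
    energies = (energies - energies.mean()) / (energies.std() + 1e-8)
    return torch.from_numpy(energies).unsqueeze(0)


def run_visualiser(audio_stream, generator: nn.Module):
    """Consume an iterable of audio windows and yield one image per window."""
    generator.eval()
    z_prev = torch.zeros(1, LATENT_DIM)
    with torch.no_grad():
        for audio_frame in audio_stream:
            z = audio_to_latent(audio_frame)
            z_prev = SMOOTHING * z_prev + (1.0 - SMOOTHING) * z  # smooth transitions
            frame = generator(z_prev)  # (1, 3, 64, 64) tensor in [-1, 1]
            yield frame.squeeze(0).numpy()


if __name__ == "__main__":
    # Fake audio stream: a few windows of a sweeping sine tone stand in for
    # the live feed that would arrive from a sound card in a real performance.
    t = np.linspace(0, 1, FRAME_SAMPLES, dtype=np.float32)
    fake_stream = (np.sin(2 * np.pi * (220 + 40 * i) * t) for i in range(10))
    for i, frame in enumerate(run_visualiser(fake_stream, TinyGenerator())):
        print(f"frame {i}: shape={frame.shape}, range=({frame.min():.2f}, {frame.max():.2f})")
```

In a live setting the fake stream would be replaced by buffered input from the venue's audio feed, and the untrained placeholder generator by a network trained offline on the selected natural-world footage.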