Video performance wall

Interactive audio visualizer

Program 6: "balls", a particle system reacting to increasing magnitudes of bass energy

Origins

Back in 2008, Firefly Makani was still an active musical group that frequently played live shows. As the group was a duo, replicating the music on stage was impossible due to the limited number of hands at the disposal of the two performers.

We thus adopted an approach where songs were subdivided into several loopable recordings which could be controlled during playback (rearranged, extended, skipped, etc.).

The purpose of these loops was to keep time and key elements going while each member played live instruments, making each performance unique as the songs deviated from their recorded versions. There was, however, more to the aural aspect of the performance.

Playback control

Apart from providing live melodic input and effects on top of the controllable recordings, an additional mechanism was developed in which all running audio streams, separated by rhythmic and melodic content, could be manipulated in real time.

By using a short input buffer (into which all live audio output is recorded), the system could retrigger small snippets while still being able to return to the raw, unprocessed signal to maintain meter and tempo.
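
As an illustration, here is a minimal C sketch of the retrigger idea. The actual system was a Max/MSP patch; the names, buffer size, and sample handling below are assumptions, not the original implementation.

/* A minimal sketch of the retrigger idea, not the original Max/MSP
 * patch: a short ring buffer records the live output, and while
 * "retrigger" is held a snippet of the buffer replays in a loop;
 * otherwise the raw signal passes through so meter and tempo stay
 * intact. Buffer size and field names are assumptions. */
#include <stddef.h>

#define BUFFER_SAMPLES 88200 /* ~2 s at 44.1 kHz, assumed */

typedef struct {
    float  buffer[BUFFER_SAMPLES];
    size_t write_pos;     /* where live output is recorded */
    size_t snippet_start; /* start of the retriggered snippet */
    size_t snippet_len;   /* snippet length in samples, must be > 0 */
    size_t play_pos;      /* read position inside the snippet */
    int    retrigger;     /* nonzero while the snippet should loop */
} InputBuffer;

/* Process one sample: always record, replay the snippet when
 * retriggering, otherwise pass the input through unprocessed. */
float process_sample(InputBuffer *ib, float in)
{
    ib->buffer[ib->write_pos] = in;
    ib->write_pos = (ib->write_pos + 1) % BUFFER_SAMPLES;

    if (!ib->retrigger)
        return in; /* raw signal keeps the meter intact */

    float out = ib->buffer[(ib->snippet_start + ib->play_pos) % BUFFER_SAMPLES];
    ib->play_pos = (ib->play_pos + 1) % ib->snippet_len;
    return out;
}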

This part of the performance was developed by Fedde ten Berge using Max/MSP and ran on one laptop of a two-laptop setup; the same machine also captured the activity of the multiple MIDI controllers used by the performers.

Networking audio and video

The second laptop in the setup was responsible for generating video output (either mathematically rendered shapes or a layered collage of pre-recorded camera footage), which was projected onto the stage and required a bag full of proprietary-format-to-VGA cables.

The laptops synced the performance state, keeping the pulse of the images locked to the tempo of the music, as the video was manipulated by the same controllers driving the audio streams.

Whenever a controller change was registered, a UDP packet was sent to the video laptop, which would interpret the message and take action. This kept the visual aspect of the show tied to the music, making a badly executed move both visually and aurally painful.
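
A hypothetical sketch of that link in C, using standard BSD sockets: on every controller change the audio laptop fires a small datagram at the video laptop. The plain-text "cc <controller> <value>" message format, the host, and the port are assumptions; the original wire format is not documented here.

/* Hypothetical sketch of the audio-to-video link using BSD sockets.
 * The "cc <controller> <value>" text format, host, and port are
 * assumptions; the original wire format is not documented here. */
#include <stdio.h>
#include <string.h>
#include <unistd.h>
#include <arpa/inet.h>
#include <sys/socket.h>
#include <sys/types.h>

int send_controller_change(const char *video_host, int port,
                           int controller, int value)
{
    int sock = socket(AF_INET, SOCK_DGRAM, 0);
    if (sock < 0)
        return -1;

    struct sockaddr_in dest;
    memset(&dest, 0, sizeof dest);
    dest.sin_family = AF_INET;
    dest.sin_port   = htons((unsigned short)port);
    inet_pton(AF_INET, video_host, &dest.sin_addr);

    char msg[64];
    int len = snprintf(msg, sizeof msg, "cc %d %d", controller, value);

    /* Fire-and-forget: UDP keeps latency low, and a lost packet only
     * costs one visual reaction, never the performance itself. */
    ssize_t sent = sendto(sock, msg, (size_t)len, 0,
                          (struct sockaddr *)&dest, sizeof dest);
    close(sock);
    return sent < 0 ? -1 : 0;
}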

Unified aesthetic

While Firefly Makani was the creative output of two individuals, our identity remained anonymous in favour of an insect-themed lo-fi image, never displaying the names of the members, let alone their faces.

This style was reflected in green and grey dithered images and through custom video recordings of insects crawling over computer hardware, jellyfish tentacles floating in water, and similar goodness.

This imagery was reflected in the cover art, the (now no longer available) website and subsequently in the video performance wall, tying together the visual identity across all media and performance outlets of the group.

In the case of the performance wall, the imagery moved in time to the music performed live.

Program 2: "blitter", reacting to increasing glitches in audio playback

Particularly particle...ey

The available visualizations were split into several "programs", which could be changed at runtime via a controller directive.

Each of the programs started in a neutral state, always displaying moving content; programs were thus interchangeable and did not require a linear progression. The state changed by reacting to the controllers and to changes in the music: wild screeching sounds, for instance, were visualised as a violent explosion of particles or by speeding up or reversing a video feed.
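
In C terms, the core of such a program might look like the sketch below, assuming a per-frame bass-energy value in [0, 1] is extracted elsewhere; all names and thresholds are illustrative, not the original implementation.

/* Sketch of a program reacting to the music, assuming a per-frame
 * bass-energy value in [0, 1] is extracted elsewhere. Every program
 * idles in a neutral, always-moving state and only bursts when the
 * music does. Names and thresholds are illustrative. */
#define BURST_THRESHOLD 0.8f  /* assumed level for an "explosion" */
#define BURST_DECAY     0.95f /* how quickly a burst settles down */

typedef struct {
    float drift; /* baseline motion that keeps the screen alive */
    float burst; /* decaying explosion intensity for particles */
} ProgramState;

void update_program(ProgramState *p, float bass_energy, float dt)
{
    p->drift += 0.1f * dt; /* neutral state: always in motion */

    if (bass_energy > BURST_THRESHOLD)
        p->burst = 1.0f;         /* screech or bass spike: explode */
    else
        p->burst *= BURST_DECAY; /* otherwise settle back down */
}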

Additionally, programs could be overlaid on top of one another, a trick used to build intensity throughout a performance.
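
One plausible way to implement such overlaying, assuming each program rendered into its own frame buffer, is to blend later programs additively over earlier ones, so stacking programs literally raises the intensity of the image. A sketch under that assumption, not the original renderer:

/* Additively blend an overlay frame onto a base frame, clamping each
 * 8-bit channel; stacking programs this way raises the overall
 * intensity of the image. A sketch, not the original renderer. */
#include <stddef.h>

void overlay_add(unsigned char *base, const unsigned char *overlay,
                 size_t n_bytes)
{
    for (size_t i = 0; i < n_bytes; i++) {
        unsigned int sum = (unsigned int)base[i] + overlay[i];
        base[i] = sum > 255 ? 255 : (unsigned char)sum;
    }
}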

Uniform code base

The video performance reused source code from the composition of the music (e.g. the Max/MSP plugins) as well as from the website, which displayed the usual (in this case, vague) information over a constantly changing video background.

Visitors to the website would thus get a sneak peek at new imagery, while the performances tied in with the overall theme. The particle systems, however, were unique to the video performance, as the particle behaviour was tied to the live controller instruments.

Languages and technologies used:

Max/MSP, ActionScript 3, Adobe AIR, C.

Five-meter-wide epilepsy

Firefly Makani opening for Otto von Schirach @ Nieuwe Nor, 2009.