A creative approach to video, audio and technology that brings life to ideas.



Androdes is a live audio-visual electronic music show, performed against the backdrop of projected CGI visuals.

The live show is performed on a console which integrates custom-built VDrums, keyboards, sample pads, IR sensors and plenty of LEDs.

It's my responsibility to design, program and maintain these elements both in production and in live performance.



Electronic music producer and creative visionary Sartory contacted me about creating a console which would allow her to perform her music and its animated storyline live on stage.

The playable elements include custom-built VDrums, a MIDI keyboard, a Novation Launchpad and a Leap Motion IR sensor. Each element influences the live show in multiple ways: for example, when the VDrums trigger drum samples, they also light up the LEDs within their support structure. This way, each performance element produces multi-sensory information for the audience.

The system uses several pieces of software to route MIDI, DMX and OSC. However, the performance elements all communicate via MIDI, and even LED elements such as the VDrums and the Helmet are controlled via MIDI. This makes the relationship between the live performance of the music (i.e. hitting drums, playing keys and so on) and the LED feedback more direct.
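The routing idea can be sketched in a few lines: one note-on from a performance element fans out to two destinations, a sample trigger and an LED control message. The note numbers, channel assignment and sample names below are hypothetical for illustration, not details of the actual Androdes rig.

```python
# A MIDI note-on represented as (channel, note, velocity).
NoteOn = tuple[int, int, int]

# Hypothetical mapping from drum-pad note numbers to sample names.
PAD_SAMPLES = {36: "kick", 38: "snare", 42: "hihat"}

# Hypothetical MIDI channel reserved for the LED controller.
LED_CHANNEL = 15

def route_note(msg: NoteOn):
    """Fan one pad hit out to a sample trigger and an LED message."""
    channel, note, velocity = msg
    sample = PAD_SAMPLES.get(note)   # which sound to play (None if unmapped)
    # Reuse the note's velocity as LED brightness, so a harder hit
    # produces both a louder sample and a brighter flash.
    led_msg = (LED_CHANNEL, note, velocity)
    return sample, led_msg

sample, led = route_note((9, 38, 100))
# sample -> "snare"; led -> (15, 38, 100)
```

Keeping everything in MIDI means the LED message is derived from the very same event that triggers the sound, which is what makes the light feedback feel coupled to the playing.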

(Information on multi-sensory performance theory is detailed at the bottom of this page)


Experience Theory

When a rock band plays, the audience can quickly couple the actions of the band with the musical output. They can see a drum being hit and hear the sound it produces; the harder it's hit, the louder it sounds. They are able to ascertain that the music they hear is being played live.

In live electronic music, there can be a dissociation between the performer's actions and the output received by the audience. This is because the relationship electronic music producers have with their musical output is non-physical. Instead of hitting a drum which physically produces an acoustic sound, they press a button which triggers a recorded or synthesised sound. The physical action isn't directly related to the perceived output: that same button could make any sound.



If the audience can't relate what they see a live performer do physically to the music they are hearing, their experience of the music isn't much different from that of recorded music being played back. The common jibe is "you never know if a laptop musician is just checking their emails".

One solution to this is to provide visual output which is directly relatable to the audio: e.g. hitting a pad on the left of the stage triggers audio from the left of the stage and produces light feedback from LEDs within the pad. The audience can see a physical action, hear audio feedback and see a coupled visual response.
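This coupling can be made concrete as a small sketch: one physical hit yields an audio event panned to the pad's stage position, plus an LED flash whose brightness tracks how hard the pad was struck. The pad names, stage positions and the 0-127 MIDI velocity range are assumptions for illustration.

```python
# Stage position of each pad, from -1.0 (far left) to 1.0 (far right).
PAD_PAN = {"left_pad": -1.0, "centre_pad": 0.0, "right_pad": 1.0}

def pad_hit(pad: str, velocity: int):
    """Return (pan, gain, led_brightness) for one pad strike."""
    pan = PAD_PAN[pad]               # audio comes from where the pad is
    gain = velocity / 127            # louder hit -> louder sample...
    brightness = round(gain * 255)   # ...and a brighter LED flash
    return pan, gain, brightness

print(pad_hit("left_pad", 127))   # (-1.0, 1.0, 255)
```

Deriving pan, gain and brightness from the same strike is what lets the audience fuse the three into one perceived event.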

Combining audio with a directly related (but not necessarily true) visual accompaniment creates a wholly different experience from receiving audio on its own (see the McGurk effect). So by combining the physical performance of hitting a drum pad, LED light feedback from said drum pad and the accompanying sound of a drum, we are able to give the audience an altogether richer, multi-sensory experience.