All posts by Martin

Martin Franklin is a digital artist & musician, curator, online broadcaster and founder of the Sound:Space sound art symposium. His current work combines sound with other media in the digital domain, often resulting in spatial installation pieces. As a solo artist and former leader of the ambient trio TUU, his many recorded works have been released around the globe. Performances and exhibitions have included Inpact Festival, Estonia; Pixelpops; WOMAD; Cybersonica; ICA London; and Sonic Arts Research Centre, Belfast.


Modern Art Oxford, September 2008





I worked on the MyWorld strand of this major project for Oxfordshire Education Department for a year from September 2007. The final outcomes of my collaborative residency working with two primary schools, students from Wood Green School in Oxfordshire, and Carver Centre for Arts & Technology, Baltimore, was shown at Modern Art Oxford during September 2008.

I approached this project as a developmental process, knowing only that I wanted to collect material and use it to contrast the online and physical environments surrounding the project hosts in Oxfordshire and Baltimore – but not knowing exactly how.

As a “resident” artist in the schools, I tried to introduce as many new techniques and possibilities to the students as I could, leaving them to pick up and develop those techniques that were of most interest to them.

Practically, we went out on several field trips to significant locations in the lives of the students and recorded audio, video and still images from these spaces.

Two distinct works came out of this material, the first being a large two-channel moving-image piece, assembled from panoramic still photographs and manipulated location recordings from equivalent places in the UK and USA. We set up video conferences between the students from Wood Green and the Carver Centre to discuss the work, and finally swapped the material that we had gathered. I worked on the huge task of assembling the material with four students from Wood Green School, who elected to put in extra time to complete the necessary steps.

Video Extract from “UK Locations” channel:


The second piece used similar source material but was constructed as two “sound-maps”: visitors could listen through headphones by plugging an audio cable into one of several sockets cut into a map of our route, each socket providing access to recordings taken at one of the significant locations.

Here are a couple of audio examples from the Boar’s Hole, a resonant brick tunnel and stream running under the main rail line to London in Cholsey, West Oxfordshire:

Inside the Boar’s Hole


The Underground Stream


Project Archive at Oxfordshire County Education Dept.

Bindu Point

Motion-triggered audio/video performance

For 18 months, from mid-2004 through to the end of 2005, I worked with performer Lee Adams on this project after being awarded a research and development grant from Arts Council England.

The basic concept was to use video motion tracking to produce a responsive sound field, and as we developed the project, this became a sound source that I could improvise with and shape using various software and hardware controls.

Rather than being a purely academic exercise, we sought to produce an engaging, dynamic performance piece as well as explore some of the possibilities of using digital media in the performance arena.

This was my first large project using Max/MSP to build a software system, but before we go into the technical steps, here’s a movie of the performance to give an idea of the experience.

Video Downloads:

Performance at Sonorities Festival, SARC, Belfast (QuickTime MOV 7.9mb)
Early performance video edited for ePerformance & Plugins Festival, Sydney, Aus. (QuickTime MOV 28mb)

Generative Output

This image shows the interface of the final application, combining the generative output principle of the original system with some live performance controls that I could use to modify the sound output.

I used the Cyclops external from Cycling ’74 to provide the motion tracking, with one of the Apple iSight FireWire cameras viewing the performance space. The hotspots provided by Cyclops are configured to output a MIDI note number when movement is detected in any of five zones. Every two note values are then converted to their binary equivalents and become ‘parent’ values. These ‘parents’ are fed to a logical operator that responds by flipping the value to 1 if either input is nonzero, or otherwise returning zero. The outcome is that a unique ‘child’ value is generated, which is then fed to a Korg Triton synthesiser and to a software Wavestation built into the app.
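The parent/child logic above can be sketched outside of Max/MSP. This is a hypothetical re-creation, not the original patch: it assumes each pair of hotspot MIDI note numbers is combined bit by bit, with each bit of the ‘child’ set to 1 if either ‘parent’ has that bit set (a bitwise OR), producing the new note value that would be sent on to the synthesisers.

```python
# Hypothetical sketch of the generative note logic described above.
# Two 'parent' MIDI note numbers (0-127) are combined bit by bit:
# each output bit is 1 if either parent's bit is nonzero, else 0.

def child_note(parent_a: int, parent_b: int) -> int:
    """Combine two 'parent' MIDI notes into a 'child' note via bitwise OR."""
    child = 0
    for bit in range(7):  # MIDI note numbers fit in 7 bits (0-127)
        a = (parent_a >> bit) & 1
        b = (parent_b >> bit) & 1
        child |= (1 if (a or b) else 0) << bit
    return child

# Example: two notes triggered from different motion-tracking zones
print(child_note(48, 12))  # -> 60 (0b0110000 OR 0b0001100 = 0b0111100)
```

In practice the loop is equivalent to Python’s `parent_a | parent_b`; it is written out here only to mirror the per-bit description of the logical operator in the patch.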

Control screen for the “Bindu Point” Max patch

I wanted to be able to get my hands on and mix the sound, so the outputs from the two sound sources were run through a small mixer with an effects unit patched in. The new performance controls enabled me to add sustain and pitch shifting, and to change the velocity of each note in real time. As the performance grows, the sound builds into an evolving field of layered drones that the performer simultaneously creates and responds to.

During rehearsals, we added video feedback to the system by projecting the trigger image back into the space and tuning the camera position to the point where it would respond to the changing light levels with chaotic, swirling images of the performance. This is probably no revelation to video artists, but in performance it provided a spectacular visualisation of the audio output.

With this system, I wanted to address the issue of mediatised performance, so there were no screens or artificial barriers between us and the audience. It’s a purely experiential performance that uses technology to augment the human experience, and although the audio output certainly shapes what I can do, the system allows the human element to take the lead and initiate each stage, as well as respond to the audio and video outputs.

I began the first stage of this project in 2003, while studying for my MA, and made a movie of Lee Adams using the first version of the system at Bow Arts Trust in East London. The movie is packaged as a skinned QuickTime called “Telecommunication” (20mb ZIP).


After an initial foray into WordPress development for this year’s Sound:Space symposium web site, I’m starting to transition the whole Codetrip site over to the new platform.

I’ve just finished up a whole series of projects, so I’ll add news and restore some of the downloads from the old site as I go.