This footage was shot on Wednesday 13th October, 2010 in the Brisbane Artist Run Initiative tent, situated in King George Square. Joel Stern asked me to provide some visuals for an improvised performance as part of the Other Film night at BARI. We discussed it initially as a duet with Joe Musgrove; however, on the night we were accompanied by Joel and Josh Watson.
Following the Superflux performance discussed here, I wanted to take some aspects of the expanded cinema performance and incorporate them into my materials and performance aesthetic for this event. In the last few weeks I’ve been experimenting with the Canon 550D, specifically the affordances it provides for digital video recording when combined with access to a fairly full complement of SLR features. On the morning of the event I spent two hours at the Mt Coot-tha Botanic Gardens collecting source footage, specifically macro footage of plants. Adding a 36mm Kenko extension tube to a 55-250mm zoom lens allowed me to shoot close-up footage from a variable distance of around 3-5ft. Adding in the 550D’s “movie crop” mode (which records a native 640×480 region of the sensor and blows it up, effectively magnifying around 5x) allowed for extreme close-ups which not only provide a unique view of the fine hairs and skin of the plants but also deny the viewer instant identification. This connects in some way to acousmatic music and the ideas of Pierre Schaeffer (and the sound object), which I believe have a lot to teach audiovisualists. The footage I captured is a macro-cosm taken away from its source, in a manner comparable to the sound object in acousmatic composition. The intention was to generate an image that could be interpreted anew by the audience in relation to the sound that accompanies it.
Another unintended joy is the effect of a narrow depth of field on image clarity. Most of the pieces drift in and out of focus. While this would normally be detrimental, it was perfect for my purposes: blurry footage is easier to mix, and the shifting focus gives the material a movement of its own. In reference to the Superflux performance, I wanted footage that I could manipulate in real time, much like the expanded cinema performers manipulate strips of film. The constant pulling in and out of focus suggested rhythmical patterns that could be manipulated by scrubbing through the footage. This allowed me to improvise not just with the playback and opacity of materials, but also with the movement of the footage, in relation to the performers.
I ended up with around 47 short pieces that, combined with another 20 video segments produced while testing the camera, form the basis of the visual performance. I transferred all of the 640×480 pieces as-is from the camera to my working directory for access through VDMX. The few extra pieces that weren’t in this format were cropped and output using MPEG Streamclip.
These pieces were separated into two video channels, with the remaining 20 excerpts placed in a third. Using three octaves of a Korg microKONTROL MIDI controller keyboard, I set each white key to trigger two clips (one from each of the first two channels) that could be crossfaded between. The third channel’s material was set on the black keys. Clips were triggered on each press and remained active only while the key was held. I could also control speed and direction, as well as scrub through a video as mentioned above. While I had access to Blur, Luma and my adaptation of Vade’s Rutt-Etra, I only used the Luma once, finding it satisfying enough just to manipulate the footage.
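As a rough illustration of that keyboard layout (this is a sketch, not the actual VDMX mapping; the MIDI note range and clip indexing are my assumptions), the white-key/black-key split can be derived from the note number’s pitch class:

```python
# Sketch of the described layout: white keys address crossfadable clip
# pairs on channels 1+2, black keys address channel 3. Assumes the three
# octaves start at MIDI note 48 (C3) - an illustrative choice, not fact.

WHITE_PITCH_CLASSES = [0, 2, 4, 5, 7, 9, 11]   # C D E F G A B
BLACK_PITCH_CLASSES = [1, 3, 6, 8, 10]         # C# D# F# G# A#

def map_key(note):
    """Return (channel description, clip index) for a MIDI note number."""
    pitch_class = note % 12
    octave = (note - 48) // 12                 # 0..2 across the three octaves
    if pitch_class in WHITE_PITCH_CLASSES:
        # 7 white keys per octave, each triggering a clip pair
        clip = octave * 7 + WHITE_PITCH_CLASSES.index(pitch_class)
        return ("channels 1+2 (crossfade pair)", clip)
    # 5 black keys per octave, each triggering a channel-3 clip
    clip = octave * 5 + BLACK_PITCH_CLASSES.index(pitch_class)
    return ("channel 3", clip)
```

Triggering on note-on and releasing on note-off would give the momentary, key-held behaviour described above.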
BARI had an interesting rear-projection setup installed. My position as visualist was to the rear of the screen, where I could see both my footage and the musicians, who could in turn see the footage from the front. It’s a little ropey at the start, and a little too black, courtesy of some additive clips that were cancelling each other out and an aggressive Luma I forgot to switch off. Still, towards the end of the performance we work up to a kind of synchretic tug of war: I could respond to the music by scrubbing and changing footage in relation to the sonic gestures outlined. While this may not be the most compelling performance, particularly on replay, I’m happy with the footage and see the potential for an interesting performative approach to be drawn from this.
I used the Canon 550D with a 50mm f1.4 lens to document the event; I should have used a higher ISO and opened the aperture more, as the end result appeared much darker than it seemed on the LCD screen when I was setting up. Audio was recorded using the Zoom H4n and has not been touched… it could probably use some EQ, as the tent wasn’t the most vibrant acoustic space… I need to do another test with better lighting, and to consider moving up from 480p to 720p depending on processor power etc. This is particularly worth doing when I’m not relying on much in the way of effects (maybe a luma key?)