So this was Scott Baker's and my debut test performance. More info below…
I have 4 monophonic channels of Nord Modular goodness labelled Organum / Red / Green / Blue.
The RGB channels are output, predictably, to the RGB of the Synchronator. The Organum is just a drone that creates a bed for the other sounds to sit on.
All channels are also bussed through their own channel strip with a SaltyGrain granulator attached. This is where the majority of the live performance happens, as I'm tweaking both the Nord parameters and the granular settings (using a NanoKontrol) in real time.
The Synchronator output is fed to a CRT, which is picked up by Scott's DV camera, which in turn sends the signal via FireWire to his laptop for processing. He is using Resolume with some Quartz Composer plugins and processing the input in real time.
We had some issues with the video at the start that were probably my fault. I set up my phone to do a guerrilla recording in case the room recording didn't work; on setup it fired a quick flash to meter the lighting, which seems to have confused the DV camera. I was reasonably happy with the performance. For droney stuff it's a pretty versatile performance setup, and I can have satisfying control that is both fine and fairly drastic in equal measure. The live improvisation didn't flow as smoothly as some of the SoundCloud demos, but it certainly wasn't awful. I kinda wished I wasn't sitting right next to the subwoofer though, as it coloured my perception of the output substantially.
Aside from the initial issues outside of Scott’s control (and a lack of time to prepare due in part to Brisbane’s shitful traffic jams) I was happy with the visual elements and can see the minimalist aesthetic being extended effectively in further iterations. I’m going to produce several more live jams with this setup and send the separate files to SB so that he can run them through the process and refine it. Hopefully we will get another chance to live demo it.
With my (currently dormant / deferred) PhD research I was looking at different approaches to audiovisual correlation that could be presented in live situations. One of the end goals is to produce a series of "Studies" that help illuminate diverse performative approaches. In my initial run of examples I used the Synchronator for a Study in Direct Causal Transduction and used Runes processed through MetaSynth to form the Study in AV Translation. The end result of that approach was somewhat lacking, and I feel that Folding Time is a more interesting and eloquent example of AV translation, where the sound is indirectly converted to image via the Synchronator/CRT/DV-cam setup. It also allows for a level of performative creativity on both Scott's part and mine that compares well to the approach Andrew Thomson and I took for N4rgh1l3 performances (though those were very much indirect).
I've been using Numerology a lot lately, as alongside the Launchpad it has become the most useful tool for controlling the Nord Modular. I am, however, going through the familiar aversion to being tied to a sequencer. Since all the elements are out of the Gator case now, it is easier for me to reconfigure for different approaches, and I'm interested in finding something outside of the grid that is a little more immediate / less automated – inspired in a way by PantoMorf.
ISOController is a decent alternative for using the Launchpad, and it has a working standalone version, but I would need to find a way to separate the MIDI channels. It is really better suited to solo / lead parts or to working with the CIM improvisation algorithms that I was helping test.
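For the curious, the channel separation I'm after is conceptually simple. Here's a minimal sketch of the routing logic in Python – the tuple message format and function names are my own illustration, not anything ISOController actually exposes:

```python
# Sketch of MIDI-channel separation: one controller stream split into
# four per-slot streams, one per Nord Modular slot.
# Messages are modelled as (status_byte, data1, data2) tuples -- a toy
# stand-in for a real MIDI library.

def split_by_channel(messages, channels=(0, 1, 2, 3)):
    """Bucket channel-voice messages by the low nibble of the status byte."""
    buckets = {ch: [] for ch in channels}
    for status, d1, d2 in messages:
        if status < 0xF0:              # channel-voice message (not system)
            ch = status & 0x0F         # channel lives in the low nibble
            if ch in buckets:
                buckets[ch].append((status, d1, d2))
    return buckets

# Example: note-ons on channels 0 and 2, note-off on channel 0
stream = [
    (0x90, 60, 100),   # note on, channel 0
    (0x92, 64, 100),   # note on, channel 2
    (0x80, 60, 0),     # note off, channel 0
]
split = split_by_channel(stream)
```

In practice this would sit between the controller's single output port and four virtual ports, one per NM slot.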
Ditching the Launchpad and moving to the iPad opens up some different approaches.
Liine LEMUR offers a couple of inspired examples:
ABreakpoint 2 looks pretty wild but may be too hands-off for what I'm after.
Likewise, Enchord is useful for generating complex melodic patterns within a set key, but it is a little difficult to use in a live situation due to its small UI elements and the multiple screens you have to navigate.
I may need to build my own template, either in LEMUR or in Max/MSP (with the Mira app), though I'd rather not, as I'm over the ratio skewing towards learning and building as opposed to making and mixing. What I'd really like is something a little like the Virtual ANS app: four screens – one for each NM slot – where I can "paint" notes while the playhead loops.
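The data model behind that idea is tiny, which is partly why it's tempting to build. A toy sketch in Python – grid size, names, and the event format are all illustrative, not a spec for a real app:

```python
# Toy model of "paint notes while the playhead loops": four boolean
# grids (one per NM slot), a playhead column that wraps, and a tick
# function that emits the painted notes under the playhead.

STEPS, ROWS, SLOTS = 16, 8, 4

def new_grids():
    """One empty STEPS x ROWS grid per Nord Modular slot."""
    return [[[False] * ROWS for _ in range(STEPS)] for _ in range(SLOTS)]

def paint(grids, slot, step, row, on=True):
    """Toggle a cell -- the 'painting' gesture on the touch screen."""
    grids[slot][step][row] = on

def tick(grids, playhead):
    """Return (slot, row) events at the playhead, plus the next position."""
    events = [(s, r) for s in range(SLOTS)
              for r in range(ROWS) if grids[s][playhead][r]]
    return events, (playhead + 1) % STEPS   # wrap back to step 0

grids = new_grids()
paint(grids, 0, 0, 3)        # paint a note: slot 0, step 0, row 3
paint(grids, 2, 1, 5)        # another: slot 2, step 1, row 5
events, ph = tick(grids, 0)  # events fired at step 0
```

Each `(slot, row)` event would map to a note on that slot's MIDI channel; the real work would be the touch UI and timing, not this structure.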
Maybe all these fancy controllers are getting in the way – perhaps I need to just return to the Graphite?
I've given up trying to fulfil the RPM Challenge for now, though I've definitely made more than 35 minutes of music this month – it's just that none of it fits the RPM guidelines. Below is the one loose track I finished. There are two other tracks appearing on a compilation soonish – will let you know when there is something to know about that.