Project: Thought Forms / Artist: Makrotulpa

A quick browse through the beginnings of this blog will show that at one stage I was focused on researching AV performance approaches.

I stopped in part because I started to feel that my goal of a performance setup, where sound and image could be synthesised simultaneously in real time, was being limited by rising CPU/GPU requirements and a lack of money to pay for them. It also seemed a little ludicrous to chase the high end when my musical inspiration is more around small-scale abstraction where individual perception joins the dots.

Today I’m happy to say the dream is close and I’m back experimenting with AV. Here is an early demonstration.

I’m using Lumen, an exceptionally capable real-time visual synthesiser which just recently received MIDI control capabilities. I designed the Max For Live controller so that I can manipulate it from the Push, record automation and, most importantly, save parameters with PPTC, which can store settings for any device on a per-channel basis. That means I can improvise and save the more interesting combinations.

I’m also using the Encoder Audio Max For Live sequencers quite extensively here, particularly the Source and Turing Machines going to the synths and Polyrandom to Microtonic for percussion. The Encoder Audio stuff is great to control from the Push 2 (with some fiddling) and sends mapping data to other controls, which is where the AV connection will ultimately arise: rather than doing the usual (and to me boring) envelope-to-opacity trick, I’m more keen on sequenced modulation mapping from different elements to different elements.

All of this is early days. It seems to be fine to play with so long as I smooth the modulations before hitting Lumen. So far, recording with Syphon has led to a crash every time, so it’s not stable yet. There is a gig in a month or so to work towards and maybe some AV recordings.
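To give a sense of what I mean by smoothing, here is a minimal sketch of the idea: a one-pole filter applied to a jumpy control stream before it is sent on as MIDI CC. This is only an illustration in Python using the mido library, not the actual Max For Live patch; the port name and CC number are placeholder assumptions.

```python
# Minimal sketch: one-pole smoothing of a modulation stream before sending
# it on as MIDI CC. Port name and CC number are placeholders, not the real patch.
import time
import mido  # assumes the python-rtmidi backend is installed

SMOOTHING = 0.2                  # 0 = frozen, 1 = no smoothing at all
CC_NUMBER = 20                   # hypothetical CC mapped to a visual parameter
PORT_NAME = "IAC Driver Bus 1"   # placeholder virtual MIDI port

def smooth(previous, target, amount=SMOOTHING):
    """Move a fraction of the way toward the target each step."""
    return previous + amount * (target - previous)

out = mido.open_output(PORT_NAME)
value = 0.0
raw_sequence = [0, 127, 10, 120, 5, 127, 0]  # a deliberately jumpy source

for raw in raw_sequence:
    value = smooth(value, raw)
    out.send(mido.Message("control_change", control=CC_NUMBER, value=int(value)))
    time.sleep(0.05)  # roughly 20 updates per second
```

The same idea applies wherever the smoothing actually lives; the point is just to take the hard edges off sequenced modulation before it reaches the visuals.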

Syntagmatic Relationships

I’ve just released something new to Bandcamp (coming soon to the usual online distribution centres).

This is an EP of four ambient experimental tracks that originated as multichannel improvisation recordings, post-manipulated into structures.

Each track is a figure with an associated image as shown on the back cover.

[Image: back cover]

To explain the glib attribution: a burst water pipe flooded the studio in my house, necessitating an urgent gear move and temporary setup. So far it seems that nothing important has been damaged.

The “returning lights” comment refers to a return to audiovisual work after a number of years.  More details on that soon.

In recent times I also published a four-track EP of improvised music with Paul Forbes Mitchell. The primitivist aesthetic contrasts pretty well with Syntagma.

 

CloudBounce Blind Test Results

 


My mixing/mastering DAW of choice, Tracktion, has just added CloudBounce as a feature. This is one of those online mastering services, like Landr, centred around a machine-learning algorithm. The results of the blind test are below.

 

I also tested Landr when it launched, and the results indicated it was expecting a more conventional kind of track – not the variable dynamics and lack of consistent transients common to electroacoustic improvisation (or “noise” if you’d prefer!). While I’m happy to explore procedural generation as a creative tool for visual art and sound design, I’m not entirely convinced it has a place in the finalisation of tracks, as these choices seem to me to be highly personal.


 

Having said all the above – while I’ve studied some of the approaches to mastering – I have no specific qualifications. So I thought I’d do a blind test. The playlist below features a track mixed by Joe from a session we did a month ago. I uploaded his raw stereo file to CloudBounce to see what it could make of it. I’ve also done my own mastered version in the box, using only the software I have regular access to. The results can be streamed or downloaded.

My questions to listeners:

  1. Which is the CloudBounce master and which is my version?
  2. Which do you prefer?

RESULTS (including poll data and PMs)

Note that very few people who PM’d me chose to guess which was the CloudBounce master; however, of those that did, and including the poll results, 60% of the vote guessed that the second track was mastered through CloudBounce.

The reality is that the CloudBounce master was track 1.

The response was pretty unanimous that track 1 was the preferred track with greater definition.  It was also considered to be “louder”.  One user responded that they thought the second track was “dull” whereas another, while suggesting track 1 was a better mix, liked the softer focus of track 2.

Personally I was pleasantly surprised at the quality of the CloudBounce master. Of particular interest to me was the increased sonic space it brought out, situating the spiky transients more effectively amongst the deep / distant reverberant sounds.

One caveat is that my mastering attempt was not thorough. I was working with a pre-mixed stereo track and merely applied Airwindows Console4 (Buss and Console) with Melda Spectral Dynamics and TurboComp, flattening the overall sound somewhat, which perhaps accounts for the dull/unfocused result. Clearly I have some learning to do.

I’m certainly keen to use CloudBounce services again.  If you are reading this and would like to test it out please click this referral link and let me know how you go.

 

 

Not all people like what I like and that’s just fine!

Here is a musical demo combining Push, Madrona Labs’ Kaivo and Zynaptiq’s new Adaptiverb, which sounds like it could be an essential tool for us ambient musos who use reverb as an instrument.

I haven’t been super productive over the last fortnight as I’ve been spending most of my spare time with No Man’s Sky, the latest implementation of a thing I very much like: exploring procedurally generated worlds.

[Image: My ugly mug in a NMS landscape via Pikazo]

Noctis is probably my earliest obsession in this regard, though I’ve always played flight simulators and open-world RPGs for much the same reason. While there is an overarching story to the universe, it is mostly flavour to set atmosphere.

[Image: Noctis IV]

I showed NMS to Joe and he agreed it was “Noctis on ‘Roids”.

[Image: No Man’s Sky – can’t capture screenshots without Steam – very tech-no-logic-al.]

What has been slightly depressing is watching the online tantrums among those who were clearly expecting something more like Star Wars. Game elements are minimal and repetitive, but comparable to other titles with a similar focus on exploration and procedural generation. While it is certainly buggy and unoptimised, what is more glaring is the apparent disparity between what was demonstrated during development, feeding the hype, and the reality. This amusing video summarises it quite well.

It’s quite clear to me that the first one does not use substantial procedural generation, as the creatures look realistic… and dull.

I recall the same negative response to Spore – and this excellent article by Soren Johnson thoughtfully explores the chasm between ground-breaking theories and hard truths about game design. I certainly think that NMS is an interesting case: with a much smaller development team it didn’t suffer the same amount of interference, yet it seems to be copping a similar backlash.

[Image: The Library of Babel]

One thing with Procedural Generation is that you can end up with the “Library of Babel” effect – illustrated above by a generated page from one book, on one shelf, on one wall of the section in the virtual library generated by my birthdate in hexadecimal format.  A real page-turner!
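To show why that happens, here is a toy sketch of seed-driven generation in Python (entirely hypothetical; the real site has its own addressing scheme): every seed maps deterministically to a page, but nearly every page is gibberish.

```python
# Toy illustration of the "Library of Babel" effect: every seed (address)
# maps deterministically to a page of text, but nearly all pages are noise.
import random
import string

ALPHABET = string.ascii_lowercase + " ,."

def generate_page(seed: str, lines: int = 5, width: int = 40) -> str:
    rng = random.Random(seed)  # the same seed always yields the same page
    return "\n".join(
        "".join(rng.choice(ALPHABET) for _ in range(width))
        for _ in range(lines)
    )

# e.g. a birthdate in hexadecimal as the page's "address" (placeholder value)
print(generate_page(hex(19800101)))
```

The structure guarantees that every page exists somewhere, which is exactly why almost none of them are worth reading; the player or reader has to supply the meaning.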

If the player can bring some of their own imagination to NMS then I feel it is quite an engaging procedural sandbox, but placing this in a commercial game is clearly problematic. Noctis IV was a free game, as is the more serious/less gamey Space Engine. Artmatic Voyager (from the makers of Metasynth) is a payware title that generates procedural worlds but not real-time environments; rather, it renders attractive backdrops for you to populate with your own sci-fi assets.

[Image: Artmatic Voyager]

Perhaps the most promising procedural worlds generator on the horizon is Ultima Ratio Regum – but that appeals more to the rogue-like obsessed who also worship at the altar of Dwarf Fortress (nothing wrong with that!)

Anyway – that’s my two cents. To return to the more music-focused purpose of this blog – Ohmwrld is something I worked on with Joe that is inspired by virtual worlds and features cover art generated by Bryce, one of the earlier strands of Artmatic Voyager.

And here is something that sounds procedurally generated (but is actually improvised) that I made recently with Paul Forbes-Mitchell.

Final thoughts – the death of Makrotulpa is exaggerated – something new is coming.

A space for sonic contemplation

With no specific projects on the horizon it is of course time to make some.  As is always the way – I’m refining my current setup.  Regular readers may remember this picture from times past…

[Image: the old synth table]

Well, I’ve returned to this idea somewhat and managed to find a neat way to fit everything while maintaining a workspace for recording and mixing, with room to also work on the educational stuff that pays the rent/bills. Having a dedicated desk makes it easier to focus on improvisation and development without mouse tweakage.

[Image: the new table setup]

All the hardware runs through the MOTU, which is controlled via Push and Ableton. The V-Synth is being used as both a source and a controller; I discovered its X/Y pad is great for modulating the System-1m filter. Three stereo channels out of the MOTU go to the R16 for recording. There is also a send/return that goes via the Aira FX, one send that goes to the Loopstation and returns as a stereo feed to the R16, and one that goes to the V-Synth for resampling of material.
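Just to lay that signal flow out at a glance, here it is written as a small Python structure. This is purely a restatement of the routing described above, not anything generated by the setup itself:

```python
# The signal flow described above, written out as data for quick reference.
routing = {
    "hardware -> MOTU": "all hardware inputs; MOTU controlled via Push and Ableton",
    "MOTU -> R16": "three stereo channels out, for recording",
    "MOTU send/return -> Aira FX": "effects loop",
    "MOTU send -> Loopstation": "returns as a stereo feed to the R16",
    "MOTU send -> V-Synth": "for resampling of material",
}

for path, role in routing.items():
    print(f"{path}: {role}")
```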

[Image: gear panorama]

I’m going to try and keep this arrangement for a while as it seems pretty comprehensive and useful. The upcoming Live 9.7 features, especially the ability to select ins and outs, will make this setup even more viable. Here is an ambient improvisation I recorded.

Hand-me-down

Today my mother gave me a Cedar book stand that my grandfather made.

I tried setting it up with the Push – it was pretty interesting but not as useful as it looks – mainly because the angle means it can’t stand in front of the synth rack.


So I’ve found a more interesting use for it. I’ve got the old Ikea stand back for the Push and now I have a front-facing mixing desk.


The Unconference Showcase went pretty well. PFM sounded fabulous, Feet Teeth filled the auditorium with menacing neo-kraut-rock, and the Sonic Manipulator was… well…

[Image: the Sonic Manipulator]

…indescribable? He made everyone happy. My intro performance with just Shnth and Loopstation went down well also. With the new firmware I can add compression and EQ to make it sound better in the room.

Catching up with Chronos

It’s been a while… managed to shift some piles of work into the “Out” tray just in time for…

NIME + Unconference

The New Interfaces for Musical Expression conference is on in Brisbane this year.

Some very exciting stuff to see/hear/talk about, and I’ve been tasked with organising the Unconference at The Edge. Amongst the guests we are blessed with DJ Sniff and the Sonic Manipulator demonstrating their approaches to performance and composition.

The Sonic Manipulator will also be performing at the showcase gig in the evening with fab local innovators, Feet Teeth, Hetleveiker and LopezDonado.

[Image: Unconference Showcase]

Noise Wall

Joe Musgrove, Adam Sussman and I provided audiovisual support for the closing-night party of Ali Bezer’s Noise Wall. It was pretty chaotic and we ended up losing the audience war to a nearby coast music band (think somewhere between RHCP and Led Zeppelin), but overall it was a fun night with some unique sounds. Here is an excerpt.

Reaktor 6

In my period of catching up I’ve been getting back to Reaktor 6, particularly messing with the Blocks now that the user library is presenting some very mature options. Michael Hetrick’s Euro Reakt Blocks are essential, as is the Instrument Browser for drag/drop time saving.

[Image: Reaktor 6 Blocks]

One thing I have found kinda irritating is the Push knobs not being available for Blocks. Despite PrEditor making them available for most ensembles, the controls aren’t always consistently added, which makes sense when you consider these are discrete modules to be connected as you wish in a more temporary fashion.

The best solution I’ve found is to add an M4L CC sender patch like ControlChange8 from Robert Henke and use Reaktor’s MIDI learn. It saves the messing around when making crazy sounds.
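If you’d rather script it than use an M4L device, the same idea can be roughed out in a few lines of Python with mido: wiggle each CC briefly on a virtual port so Reaktor’s MIDI learn can latch onto it. The port name and CC numbers are placeholders for illustration, not anything ControlChange8 actually uses.

```python
# Rough script equivalent of a CC sender: sweep each CC briefly so Reaktor's
# MIDI learn can latch onto it, then the CCs can drive Blocks parameters.
# The port name and CC numbers are placeholders for illustration only.
import time
import mido

PORT_NAME = "IAC Driver Bus 1"          # hypothetical virtual MIDI port
CCS = [20, 21, 22, 23, 24, 25, 26, 27]  # e.g. one CC per encoder

out = mido.open_output(PORT_NAME)
for cc in CCS:
    # Sweep the CC up and back down so it is easy to capture with MIDI learn.
    for value in (0, 64, 127, 64, 0):
        out.send(mido.Message("control_change", control=cc, value=value))
        time.sleep(0.02)
    time.sleep(0.5)  # pause before moving on to the next control
```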

Loopstation

Last bit of news – a surprise firmware update for the RC-505, which has been crying quietly in my cupboard for the last few months. The key changes for me are the ability to have up to three simultaneous effects on both the input and the output, though I’d like it more if there were a way to route certain tracks through certain effects. There are also a bunch of additional CCs to allow external assignment (which makes the guitar players happy) and a useful ALL CLEAR function.

The most disappointing addition has to be the monitoring function. The phones jack used to just output the headphone mix – now you can choose which track to hear, but sadly it is post-gain for each looper, which makes it kinda useless for monitoring in my opinion. It would be nice to have the phones monitor the input, or individual tracks pre-gain, before making them live. Oh well – a missed opportunity. Overall there’s enough goodness to get it out again though – I bashed out a track with it in celebration of Roland’s surprise continued support of a device I’d assumed they’d forgotten about.