2016 – you bastard!


If you open your eyes to social media it would seem that 2016 was an especially awful year. This could be believed if you managed to sleep through 2015 and plan to cryogenically freeze yourself for at least the next four years. Still – desperate times call for intense art, so frankly I'm happy not to be in the real estate business.

From the perspective of my art it has been a better year than I've had for some time. I completed and released Black Mercury in June, and for once I'm proud and satisfied that it features everything I represent, in a manner acceptably advanced from Remnants and Mise En Scene. It features experimental aesthetics without overloading, references popular music occasionally, and doesn't sound like Tim Hecker or Autechre.

I've also played some great gigs (particularly at the Lindsay Crawford-curated Oscilloscapes) with some strong improvisatory turns that I will hopefully nurture and extend into 2017. For one, I'm excited about the impending return of Small Black Box under the curatorship of David Loose, the best engineer for experimental music I've ever worked with.

After 15 years of collaboration Joe Musgrove and I have settled into an electro-acoustic improv rhythm that summarises our influences well, as can be heard at our Bandcamp. We both seem to be moving away from our elitist tendencies and even managed to make some ambient techno.

Likewise my collaboration with Paul Forbes-Mitchell is becoming more than the sum of noisy parts as we both develop our various hardware setups and improvisatory chops towards building levels of structure within the (necessary) chaos.

I also managed some interesting collaborations on the Ambient Online comps.


Midway through the year it became clear to me that the major work changes predicted were happening to Stacey rather than to me.
Work as a sessional academic remains something I enjoy, but the diminishing hours and lack of security are not great for family life. For this reason it is great that my partner is finding satisfying ways to re-enter the workplace, ensuring we aren't slipping into the new year with the possibility of homelessness. In 2017 I need to learn how to manage my home time more effectively, split between family and art.

Major projects for 2017 continue to be financially unviable but hopefully include further development of the Thought Forms audio-visual performance project (and potentially a return to the thesis), plus work on some more structured, perhaps even song-like material that will hopefully contribute, however minimally, to a balancing of the world's axis.

Be well.

Lloyd W Barrett

2016


Fun with Python, current build and distribution.

I managed not to record adequate video of the house gig / debut of "Thought-Forms" material from a couple of weekends back. Here is all I got…

I felt the experience was positive and received some great feedback that set my mind into motion towards further refinement of the approach.  As always the balance of complexity, depth, playability and fun needs some tweaking.

This week an inspiring post from Tom Whitwell (creator of the Turing Machine modular sequencer) got me temporarily back into coding. For those who'd rather not click – the gist of it is a Python script that creates a Cagean set of directions for composition building.

I've modified Tom's template to focus on descriptive text and Oblique Strategies for audio and video, as this helps me build up an improvisatory framework for each "Thought Form". Here is my variation.


If you're not familiar with Python (I'm not, overly), Tom provided some instructions which I've amended below:

To run the script on a Mac (or another computer with Python installed):

  1. Click 'Download .zip' and copy the file 'thtfrms.py' into a folder somewhere.
  2. Open Terminal, type 'cd' and a space, then drag the folder containing 'thtfrms.py' into the Terminal window and hit return. You're now in that folder.
  3. Type 'python thtfrms.py' and hit return.
  4. Look in the folder and you'll find a text file with a made-up name like 'Qoyatenu.txt'. This is the complete score for your new album, entitled Qoyatenu.
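For the curious, the core mechanic behind a script like 'thtfrms.py' fits in a few lines of Python. This is only a simplified sketch – the syllable and prompt pools below are placeholder examples, not the lists in the real script – but the idea (seedable random choices written out as a text score with an invented title) is the same:

```python
import random

# Hypothetical stand-in pools; the real script draws on my own
# descriptive text and Oblique Strategies-style prompts.
SYLLABLES = ["qo", "ya", "te", "nu", "ra", "mi", "lo", "ka"]
AUDIO_PROMPTS = [
    "Honour thy error as a hidden intention",
    "Use an old idea",
    "Emphasise the flaws",
]
VIDEO_PROMPTS = [
    "Slow the feedback until it breathes",
    "Let the oscillator draw the frame",
]

def make_name(rng):
    """Invent a pronounceable title from random syllables."""
    return "".join(rng.choice(SYLLABLES)
                   for _ in range(rng.randint(3, 4))).capitalize()

def make_score(seed=None, tracks=4):
    """Build an album title and a plain-text set of directions."""
    rng = random.Random(seed)
    title = make_name(rng)
    lines = [f"Album: {title}", ""]
    for n in range(1, tracks + 1):
        lines.append(f"Track {n}: {make_name(rng)} ({rng.randint(2, 9)} minutes)")
        lines.append(f"  Audio: {rng.choice(AUDIO_PROMPTS)}")
        lines.append(f"  Video: {rng.choice(VIDEO_PROMPTS)}")
    return title, "\n".join(lines)

if __name__ == "__main__":
    # Write the score to a file named after the invented album title,
    # just as Tom's script produces e.g. 'Qoyatenu.txt'.
    title, score = make_score()
    with open(f"{title}.txt", "w") as f:
        f.write(score)
```

Each run (or each seed) produces a fresh set of directions, which is exactly the kind of Cagean prompt-generator I'm after.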

The current hardware build has changed a bit since I last posted.


After much back and forth I sold the RC-505 to Hhaarrpp in order to open up some new approaches.

Just prior to the aforementioned gig I picked up a K-Mix (bottom right), which I'm using to replace the MOTU Ultralite (which takes up the display port required for a projector adapter). One of the most impressive things about it is how flexible it can be for different setups. While initially I thought I'd use it as an audio interface, it's currently working as a mixer (with three stereo send channels!) and MIDI controller, allowing me to focus all of the computer's CPU on visuals.1 It will also be useful for getting back to live surround processing (if I can find a situation that warrants it).

The Kaoss Pad returns to my setup as a performative send FX (a bargain Gumtree find – I can't believe how much they go for new these days). It is synced to the Novation Circuit, which will add sequencer arrangement "glue" to the more abstract droning from the other synths. So far my experience with the Circuit has been good. It can make some quirky bass and pad sounds, and the ability to load samples onto it will ultimately prove useful once I get around to making interesting percussion sounds to replace the default lame ones. I am most impressed with being able to work with it in bed, on public transport or wherever, thanks to its battery power.

Another thing to point out – the Lemur template pictured is one I cobbled together especially for Lumen. It features all controls mapped, with some automatable controls for the LFOs, Oscillator Skew and X/Y pad. It can be downloaded from here.

Work on Thought-Forms will progress over the summer holidays here in Australia. I will hopefully have some material to release and gigs to do, though I'm still struggling to find an appropriate case for the Nord / Sys1m / Aira rack. It is too wide for any suitcases I've found so far.

Finally – my recent EP Syntagma is out digitally at the usual places for download and streaming.

1 I was finding the Push 2 setup became quite laggy while running Lumen.


Project: Thought Forms / Artist: Makrotulpa

A quick browse through the beginnings of this blog will show that at one stage I was focused on researching AV performance approaches.

I stopped in part because I started to feel that my goal of a performance setup, where sound and image could be synthesised simultaneously in real time, was being limited by rising CPU/GPU requirements and a lack of money to pay for them. It also seemed a little ludicrous to chase the high end when my musical inspiration lies more in small-scale abstraction, where individual perception joins the dots.

Today I'm happy to say the dream is close and I'm back experimenting with AV. Here is an early demonstration.

I'm using Lumen, an exceptionally capable real-time visual synthesiser that just recently received MIDI control capabilities. I designed the Max For Live controller so that I can manipulate it from the Push, record automation and, most importantly, save parameters with PPTC, which can store settings for any device on a per-channel basis. So I can improvise and save the more interesting combinations.

I'm also using the Encoder Audio Max For Live sequencers quite extensively here – particularly the Source and Turing Machines going to the synths, and Polyrandom to Microtonic for percussion. The Encoder Audio stuff is great to control from the Push 2 (with some fiddling) and sends mapping data to other controls, which is where the AV connection will ultimately arise: rather than doing the usual (and to me boring) envelope-to-opacity trick, I'm more keen on sequenced modulation mapping from different elements to different elements.

All of this is early days. It seems fine to play with so long as I smooth the modulations before hitting Lumen. So far recording with Syphon has led to a crash every time, so it's not stable. There is a gig in a month or so to work towards, and maybe some AV recordings.
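The smoothing itself is nothing exotic – a one-pole low-pass over incoming control values is enough to take the edges off a stepped sequence before it reaches Lumen. Here is a rough Python illustration of that idea (not the actual Max patch; the coefficient is just an example value):

```python
def smooth(values, coeff=0.9):
    """One-pole low-pass: y[n] = coeff*y[n-1] + (1-coeff)*x[n].

    Higher coeff = slower, smoother response to stepped input.
    """
    out, y = [], values[0]
    for x in values:
        y = coeff * y + (1 - coeff) * x
        out.append(y)
    return out

# An abrupt sequencer step: ten zeros, then ten ones.
stepped = [0.0] * 10 + [1.0] * 10
eased = smooth(stepped)  # the jump is spread across many small increments
```

Each output sample moves at most a tenth of the remaining distance, so the hard edge that would make a visual parameter jump becomes a glide.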

Syntagmatic Relationships

I've just released something new to Bandcamp (coming soon to the usual online distribution centres).

This is an EP of four ambient experimental tracks, originating in multichannel improvisation recordings post-manipulated into structures.

Each track is a figure with an associated image as shown on the back cover.


To explain the glib attribution: a burst water pipe flooded the studio in my house, necessitating an urgent gear move and temporary setup. So far it seems that nothing important has been damaged.

The “returning lights” comment refers to a return to audiovisual work after a number of years.  More details on that soon.

In recent times I also published a four-track EP of improvised music with Paul Forbes Mitchell. The primitivist aesthetic contrasts pretty well with Syntagma.


Cloudbounce Blind Test Results


My mixing/mastering DAW of choice, Tracktion, has just added CloudBounce as a feature. This is one of those online mastering services, like Landr, centred around a machine-learning algorithm. The results of the blind test are below.


I tested Landr too when it launched, and the results indicated it was expecting a more conventional kind of track – not the variable dynamics and lack of consistent transients common to electroacoustic improvisation (or "noise" if you'd prefer!). While I'm happy to explore procedural generation as a creative tool for visual art and sound design, I'm not entirely convinced it has a place in the finalisation of tracks, as these choices seem to me to be highly personal.


Having said all the above – while I've studied some of the approaches to mastering, I have no specific qualifications. So I thought I'd do a blind test. The playlist below features a track mixed by Joe from a session we did a month ago. I uploaded his raw stereo file to CloudBounce to see what it could make of it. I've also done my own mastered version in the box, using only the software I have regular access to. The results can be streamed or downloaded.

My questions to listeners:

  1. Which is the Cloudbounce master and which is my version?
  2. Which do you prefer?

RESULTS (including poll data and PMs)

Note that very few people who PM'd me chose to guess which was the CloudBounce master. Of those who did, however, and including the poll results, 60% of the vote guessed that the second track was mastered through CloudBounce.

The reality is that the CloudBounce master was track 1.

The response was pretty unanimous that track 1 was the preferred track, with greater definition. It was also considered to be "louder". One listener responded that they thought the second track was "dull", whereas another, while suggesting track 1 was the better mix, liked the softer focus of track 2.

Personally I was pleasantly surprised at the quality of the CloudBounce master. Of particular interest to me was the increased sonic space it brought out, situating the spiky transients more effectively amongst the deep / distant reverberant sounds.

One caveat is that my mastering attempt was not thorough. I was working with a pre-mixed stereo track and merely applied Airwindows Console4 (Buss and Console) with Melda Spectral Dynamics and TurboComp, flattening the overall sound somewhat, which perhaps accounts for the dull / unfocused result. Clearly I have some learning to do.

I’m certainly keen to use CloudBounce services again.  If you are reading this and would like to test it out please click this referral link and let me know how you go.


Not all people like what I like and that’s just fine!

Here is a musical demo combining Push, Madrona Labs' Kaivo and Zynaptiq's new Adaptiverb, which sounds like it could be an essential tool for those of us ambient musos who use reverb as an instrument.

I haven't been super productive over the last fortnight, as I've been spending most of my spare time with No Man's Sky, the latest implementation of a thing I very much like: exploring procedurally generated worlds.

My ugly mug in a NMS landscape via Pikazo

Noctis is probably the earliest obsession in this regard, though I've always played flight simulators and open-world RPGs for much the same reason. While there is an overarching story to the universe, it is mostly flavour to set atmosphere.

Noctis IV

I showed NMS to Joe and he agreed it was “Noctis on ‘Roids”.

No Man’s Sky – can’t capture screenshots without Steam – very tech-no-logic-al.

What has been slightly depressing is watching the online tantrums among those who were clearly expecting something more like Star Wars. Game elements are minimal and repetitive, but comparable to similar titles with a similar focus on exploration and procedural generation. While it is certainly buggy and unoptimised, what is more glaring is the apparent disparity between what was demonstrated during development, feeding the hype, and the reality. This amusing video summarises it quite well.

It’s quite clear to me the first one does not utilise substantial procedural generation as the creatures look realistic… and dull.

I recall the same negative response to Spore – and this excellent article by Soren Johnson thoughtfully explores the chasm between ground-breaking theories and hard truths about game design. I certainly think NMS is an interesting case: while its much smaller development team meant it didn't suffer the same amount of interference, it certainly seems to be copping a similar backlash.

The Library of Babel

One thing with procedural generation is that you can end up with the "Library of Babel" effect – illustrated above by a generated page from one book, on one shelf, on one wall of the section in the virtual library generated by my birthdate in hexadecimal format. A real page-turner!
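To get a feel for the effect, here is a toy Python sketch of how such a library can work: the same address always yields the same page, yet almost every page is meaningless noise. The alphabet and page dimensions below are arbitrary choices for illustration, not the real site's scheme:

```python
import hashlib
import random

# A small alphabet, loosely in the spirit of Borges' 25 characters.
ALPHABET = "abcdefghijklmnopqrstuvwxyz ,."

def babel_page(address: str, lines: int = 5, width: int = 40) -> str:
    """Deterministically derive a page of gibberish from an address string.

    Hashing the address gives a seed, so revisiting the same address
    always returns the same page -- the Library of Babel effect.
    """
    seed = int(hashlib.sha256(address.encode()).hexdigest(), 16)
    rng = random.Random(seed)
    return "\n".join(
        "".join(rng.choice(ALPHABET) for _ in range(width))
        for _ in range(lines)
    )

# e.g. a birthdate in hexadecimal as the address, as in the post:
page = babel_page(hex(0x12DB79F))
```

Every address maps to a stable, unique-looking page, but since nothing selects for meaning, you drown in noise – which is precisely the risk for procedural games.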

If the player can bring some of their own imagination to NMS then I feel it is quite an engaging procedural sandbox, but placing this in a commercial game is clearly problematic. Noctis IV was a free game, as is the more serious / less gamey Space Engine. Artmatic Voyager (from the makers of Metasynth) is a payware title that generates procedural worlds but not real-time environments; rather, it renders attractive backdrops for you to populate with your own sci-fi assets.

Artmatic Voyager

Perhaps the most promising procedural-worlds generator on the horizon is Ultima Ratio Regum – but that appeals more to the roguelike-obsessed who also worship at the altar of Dwarf Fortress (nothing wrong with that!).

Anyway – that's my two cents. To return to the more music-focused purpose of this blog: Ohmwrld is something I worked on with Joe that is inspired by virtual worlds and features cover art generated by Bryce, one of the earlier strands of Artmatic Voyager.

And here is something that sounds procedurally generated (but is actually improvised) that I made recently with Paul Forbes-Mitchell.

Final thoughts – the death of Makrotulpa is exaggerated – something new is coming.

A space for sonic contemplation

With no specific projects on the horizon, it is of course time to make some. As is always the way, I'm refining my current setup. Regular readers may remember this picture from times past…


Well, I've returned to this idea somewhat and managed to find a neat way to fit everything while maintaining a workspace for recording and mixing, with room to also work on the educational stuff that pays the rent and bills. Having a dedicated desk makes it easier to focus on improvisation and development without mouse tweakage.


All the hardware runs through the MOTU, which is controlled via Push and Ableton. The V-Synth is being used as both a source and a controller – I discovered the X/Y pad is great for modulating the System-1m filter. Three stereo channels out from the MOTU go to the R16 for recording. There is also a send/return that goes via the Aira FX, one send that goes to the Loopstation, returning as a stereo feed to the R16, and one that goes to the V-Synth for resampling of material.


I'm going to try to keep this arrangement for a while, as it seems pretty comprehensive and useful. The upcoming Live 9.7 features, especially the ability to select ins and outs, will make this setup even more viable. Here is an ambient improvisation I recorded.