It’s been a while since I posted to this blog, but after a night out, I felt inspired by some awesome artists who live in the city I call home. In VJing there is no “one way” or “right way” to do anything. It’s an openly expressive medium that offers incredible opportunities for collaboration and creation.

On June 22nd, 2018, {arsonist}, Char Stiles, Spednar, REW, and Jeremy Bible took over Pittsburgh’s Mattress Factory for an evening of immersive sights, sounds, and a sort of contemplative musical-emotional therapy. The place was packed and entranced through each of the three sets. For this article, I reached out to the artists to explain their process and workflow. I’ll let the videos speak for themselves.

 

Set 1: {arsonist} + Char Stiles – “eCosystem”

Mattress Factory Arsonist Char Stiles 2018

“Pittsburgh’s {arsonist} (Danielle Rager) and Char Stiles use live coded audio and visuals to create simulated ecologies. Musical phrases dynamically constrain rules of automaton interactions, leading to the genesis and destruction of synthesized life forms and their sounds. The symbiotic exchanges of the virtual organisms mirror the macro system of feedback between the model and the humans generating the collaborative A/V experience.”


{arsonist}

“My music was a combination of my live electric violin playing, Ableton Live, and the live coding language TidalCycles (https://tidalcycles.org/), which allows you to do very complex, intricate sequencing of audio samples using the concise syntax of a functional programming language built atop Haskell. I internally route the audio of the music I create in TidalCycles to Ableton Live using JACK (https://jackaudio.org/). Together, JACK and a TidalCycles “parameter context” called “orbits” allow me to “stem” the music I make in Tidal out to Live in real time; that is, to have one input channel in Live for every channel of sound created in Tidal. I make sure that the audio between Ableton Live and TidalCycles is synced to the same clock using Ableton Link.

The show at the Mattress Factory used a special 8-channel sound system, so I routed my audio output from Ableton Live to 8 output channels on my audio interface and then to the speaker system. I was able to use Ableton Live to program audio pans on the 8-channel system, or direct certain sounds to certain speakers. @CharStiles received a copy of my master audio output, which controlled parameters in her live coded visuals. She frequency binned my audio input to get the gain for a few frequency bands and used these to make her visuals audio-reactive, along with the MIDI note and duration information that I sent her from every track’s lead synth line. She can tell you more about her own live coding visual process and how she unpacked and used my audio and MIDI signals.” -Danielle
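The frequency binning Danielle describes can be sketched in a few lines: take the FFT of an audio frame, group the magnitude spectrum into a few bands, and use each band’s average gain as a control signal for the visuals. A minimal NumPy sketch, where the band edges and frame size are illustrative assumptions, not the values used at the show:

```python
import numpy as np

def band_gains(frame, sample_rate=44100,
               bands=((20, 250), (250, 2000), (2000, 8000))):
    """Return one gain value per frequency band for a mono audio frame."""
    # A Hann window tames spectral leakage before the FFT.
    spectrum = np.abs(np.fft.rfft(frame * np.hanning(len(frame))))
    freqs = np.fft.rfftfreq(len(frame), d=1.0 / sample_rate)
    gains = []
    for lo, hi in bands:
        mask = (freqs >= lo) & (freqs < hi)   # FFT bins inside this band
        gains.append(float(spectrum[mask].mean()) if mask.any() else 0.0)
    return gains

# A pure 440 Hz tone should land almost entirely in the middle band.
t = np.arange(2048) / 44100.0
gains = band_gains(np.sin(2 * np.pi * 440 * t))
```

Each element of `gains` can then drive a shader parameter, exactly the kind of per-band control signal described above.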

Char Stiles

Char was running KodeLife from two different laptops into a Roland HDMI mixer and out to the projector. Since KodeLife is a scene-based live coding application, it doesn’t offer smooth transitions from one scene to the next. That’s where the HDMI mixer comes in: the audience never had to sit through seeing the Finder interface between scenes.

She had audio and MIDI input from Danielle to make the organisms dance to the music. Two parameters from Danielle were bound to variables in the shader: the gain of a particular frequency band (picked out by the binning interface in KodeLife) and the MIDI notes from the main “voice.”
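Binding a MIDI note to a shader variable usually means normalizing it first, since shaders expect floats in a convenient range rather than raw 7-bit MIDI values. A hypothetical sketch of that mapping, where the note range is an assumption rather than anything from the actual performance:

```python
def midi_to_uniforms(note, velocity, note_range=(36, 96)):
    """Map a MIDI note event to normalized [0, 1] values a shader can use.

    note_range is an illustrative assumption: the span of notes expected
    from the lead synth line, not a value taken from the show.
    """
    lo, hi = note_range
    # Clamp pitch into [0, 1] so out-of-range notes don't break the shader.
    pitch = min(max((note - lo) / float(hi - lo), 0.0), 1.0)
    amp = velocity / 127.0   # MIDI velocity is 7-bit: 0-127
    return pitch, amp

pitch, amp = midi_to_uniforms(60, 100)   # middle C at moderate velocity
```

The resulting floats can be uploaded as uniforms each frame, alongside the per-band gains from the audio analysis.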

A quick word on the content: she implemented a version of Conway’s Game of Life called SmoothLife (https://arxiv.org/pdf/1111.1567.pdf) in one shader pass, then used that output texture as a lookup for the ray-marched organisms in the second pass. She’ll soon have a tutorial on how she implemented ray marching in KodeLife.

Here’s a promo from their set together:

 

 

 

 

Set 2: Spednar + REW – “Computational Chiaroscuro”

Mattress Factory Rew Spednar 2018

Pittsburgh’s Kevin Bednar and Rachel Wagner have created an audio/visual piece which entwines minimalism and maximalism through a monochromatic lens. The performance presents fleeting and evasive visual states informed by shifting rhythmic structures. Composed using Python, vvvv, Processing, and Resolume – the visuals augment an intentionally absent aural palette to complete a multisensory experience.


Spednar

Spednar used TidalCycles to live code the audio, Dirt to send the audio to 8 output channels, and jackmixer as a graphical mixing tool to MIDI-map the 8 channel levels. He compiled a GitHub repo explaining the process in detail: https://github.com/kbdnr/Tidal-multichannel
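The jackmixer step, MIDI-mapping the eight fader levels, boils down to scaling each channel by a controller value. A hypothetical sketch of that mapping, assuming a simple linear fader curve (jackmixer’s actual taper may differ):

```python
def cc_to_gain(cc_value):
    """Map a 7-bit MIDI CC value (0-127) to a linear gain in [0.0, 1.0]."""
    return cc_value / 127.0

def mix_frame(channels, cc_values):
    """Scale each channel's sample by its mapped fader level."""
    return [cc_to_gain(cc) * sample
            for cc, sample in zip(cc_values, channels)]

# Channel 0 at full level, channel 1 muted, the remaining six at half.
frame = mix_frame([1.0] * 8, [127, 0] + [64] * 6)
```

One MIDI CC per channel is all a controller needs to send to ride eight stems at once.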

REW

Rew: “For the Computational Chiaroscuro visuals, there are two types of videos that overlap. The first type is the shape layers, created by Rew in After Effects; these act as masks for the second type of videos, which were created by Spednar. To make those, he programmed random alphanumeric characters to appear and scroll across the computer screen, and screen-recorded them. Rew’s setup for performing with these visuals includes her computer with a GeForce GTX 1080 graphics card and an AMD Ryzen 7 1700 8-core CPU, the software Resolume Avenue 6, an APC20 MIDI controller, a Korg nanoKONTROL2 MIDI controller, and a Focusrite Scarlett 2i4 audio interface. The audio interface is what Spednar and Rew use to get the visuals to react to the audio: Spednar’s audio signal is sent through it to Rew’s computer, and Resolume Avenue uses the audio FFT from it. Additionally, the artists can record the visuals and audio straight from Rew’s computer using an AVerMedia 2 capture device and an HDMI splitter.”

Here’s a snippet from their performance:

 

 

 

 

 

Set 3: Jeremy Bible – “Human Savagery”

Jeremy Bible Human Savagery Mattress Factory 2018

Human Savagery (in 8.2 channel surround sound) pairs Bible’s powerful & moving, symphonic ambient compositions with his vivid ultra-HD aerial cinematography – which contrast the alien beauty of untouched mountains and desert landscapes with the chemical violence of EPA Superfund sites ravaged by industry. Bible has visited sites across the United States to create this footage, capturing a gripping snapshot of humanity’s often unseen footprint on this planet.

Jeremy Bible is an American transdisciplinary artist focused on the intersection of sound and light. Taking cues from his background in field recording and photography, his work occupies a shared space between multi-channel audio-visual installation and compositional practice. Utilizing acousmatic sonic diffusion techniques, Bible strives to construct an environmental experience that is at turns profoundly sensory and beautifully surreal.

Mattress Factory 2018 Drone AV performance

https://jeremybible.com/humansavagery

 

