May 8, 2019
I spent a good few hours this weekend working with PraxisLIVE, a programming environment designed for live performance.
That is, you can do livecoding in it, but also you can build out all sorts of live performance applications with it in advance. It uses a cool combination of Processing and a patch-based visual programming environment to create audio and visuals, the latter being my focus. I’m trying to learn it and make a few cool things to use for Bit Crushed 5.0, a local chiptune show at the end of the month put together by a buddy of mine.
Live visuals are a mixed bag for me. On the one hand, I really love the whole idea of creative coding and generating performance art software that I can manipulate on the fly. On the other, I have a really hard time just easing into it and creating stuff. I get too lost in the weeds chasing perfection for a particular thing without even knowing what that thing is. It’s an issue I’ve had with almost all my personal projects, so resolving it is definitely becoming more and more of a necessity. The biggest help, I think, is going to come from just making things more regularly. An hour-ish a day instead of once a week makes the results feel less like precious pieces of art and more like sketches with potential for improvement. Plus, making them more regularly will just make me better at creating them (in theory).
PraxisLIVE reminds me a lot of Quartz Composer, which is where I got my start with livecoding/realtime visuals. It was an OS X visual coding environment/language that created what could be considered shaders, in a sense. You could use the compositions as screensavers, import them into Apple Motion, and probably do a few other things I wasn’t aware of at the time. While you were limited to the components built into the system, they offered a surprising amount of flexibility. The system also updated on the fly, meaning you could edit your compositions while they were running and see your changes live. This was huge to me, as it allowed for all sorts of experimentation.
I wish I had more captured from my experiments with it, but unfortunately all I’ve got is this video of my senior capstone project, where I used it to create a music video of sorts in real time, complete with audio-reactive visuals and midi controls. You’ll have to excuse the poor image quality; the best I could do at the time was point a camcorder at the monitor.
PraxisLIVE provides a lot of the core components I remember Quartz Composer having, but with the added benefit of allowing new custom components to be made. That’s where the Processing stuff comes in. Each component consists of a Processing sketch, where you can do all your audio/graphics/logic work. On the visual scripting side, you get an output of the expected type (an audio signal for audio components, a PImage for video components), and you can connect those up to other components’ inputs for further manipulation. Plus, you can expose any number of custom inputs and outputs for all sorts of shenanigans.
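The patching model is basically dataflow: each component transforms its inputs into outputs, and wiring an output to an input chains the transformations. Here’s a toy sketch of that idea in plain Java (nothing PraxisLIVE-specific — all the names are made up, and a String stands in for a real frame type like PImage):

```java
import java.util.function.UnaryOperator;

// Toy illustration of the patching idea: a "video component" is just a
// frame -> frame transformation, and connecting an output to an input
// is function composition. Hypothetical names for illustration only.
public class PatchDemo {

    // Stand-in for a video component; the frame is just a String here.
    public interface VideoComponent extends UnaryOperator<String> {}

    // "Patch cable": feed component a's output into component b's input.
    public static VideoComponent connect(VideoComponent a, VideoComponent b) {
        return frame -> b.apply(a.apply(frame));
    }

    public static void main(String[] args) {
        VideoComponent source   = frame -> "noise";
        VideoComponent blur     = frame -> frame + "+blur";
        VideoComponent feedback = frame -> frame + "+feedback";

        // source -> blur -> feedback, like chaining patches in the editor.
        VideoComponent patch = connect(connect(source, blur), feedback);
        System.out.println(patch.apply(""));  // noise+blur+feedback
    }
}
```

In PraxisLIVE the runtime handles the wiring for you, of course; this is just the shape of what the graph is doing under the hood.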
This was made with two separate 3D Processing sketches, one connected to a video feedback simulator I slapped together, all passed through a compositor that spits everything to the screen. Kinda chaotic, but I like it. And while it was running I was able to edit the code handling all the components and their exposed properties on the fly, to test edits and see the results immediately. Very exploratory.
I’ll try to post more brief updates with vids as I make new weird things with this, but in the meantime you can peruse the source and clone it yourself to mess around with.