Last fall, I was excited to be chosen as one of the musicians to accompany modern dance classes for UW-Milwaukee’s Peck School of the Arts. This is something that I’ve wanted to do for some time, but until last fall there were no openings. I had previously attended a handful of classes on a voluntary basis, joining my friend Seth Warren-Crow on percussion and electronics, and thoroughly enjoyed the process of improvising for dancers. After all, music for dance has been one of my primary artistic outlets over the last few years, and to have the opportunity to practice this craft on a weekly basis is most welcome.
I’m now into my second semester of accompaniment and am developing more methods and techniques to use in the class. While I continue to bring my saxophones (alto/tenor) for a strong physical/acoustic presence, most of my development involves my laptop setup. That’s a natural direction for me: the laptop work provides the primary foundation, while the saxophone embellishes the laptop’s pitch-based material and aids sectional structure through phrase development. Also, I’ve been playing the saxophone for decades and have become quite comfortable in a variety of improvisational settings; the laptop environment provides me with new challenges and opportunities.
From my experiences with Seth, I observed that the rhythmic/pulse needs of the class can change suddenly. This requires a dynamic response on the accompanist’s part: moving from a strong pulse to ambient and back. Some of my previous work with Max For Live patches was certainly going to be beneficial here. I spent the previous semester fine-tuning how I use my primary rhythmic M4L patch, the C8 Sequencer: adjusting the number of instances used, the chance of triggering a sound, the polyrhythmic complexity, and how all of this is affected by tempo. I learned a lot about controlling these parameters in real time and will be making some changes to exactly how it’s all controlled. For the time being, though, I’ve put together a demonstration of what’s been done so far and posted it to my SoundCloud account. I imagine there will be similar posts in the future.
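For anyone curious what “chance to trigger” and polyrhythmic layering mean in practice, here is a minimal Python sketch of the idea. This is not the C8 Sequencer itself; the function names and numbers are my own illustration. Each layer is a looped step pattern whose steps fire with a given probability, and layers with different cycle lengths (say, 3 against 4) drift against each other to create the polyrhythm:

```python
import random

def chance_steps(n_steps, chance, rng):
    """One looped pattern: each step fires with probability `chance`."""
    return [rng.random() < chance for _ in range(n_steps)]

def polyrhythm(cycle_lengths, chance, n_ticks, seed=0):
    """Layer several step cycles of different lengths over n_ticks.

    Each layer loops its own pattern; differing cycle lengths
    (e.g. 3 against 4) produce the polyrhythmic effect.
    Returns, for each tick, the list of layer indices that fire.
    """
    rng = random.Random(seed)
    patterns = [chance_steps(n, chance, rng) for n in cycle_lengths]
    return [
        [i for i, pat in enumerate(patterns) if pat[t % len(pat)]]
        for t in range(n_ticks)
    ]

# Two layers, 3-step against 4-step, 80% chance per step, 12 ticks.
hits = polyrhythm([3, 4], chance=0.8, n_ticks=12)
```

Raising or lowering `chance` thins or thickens the texture in real time, which is roughly the kind of control I mean when I talk about adjusting chance and complexity while the class is moving.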
In the last couple of months, I’ve been getting some new material together for solo laptop performances. I’ve used my laptop many times in a group setting, but never in a stand-alone performance. It’s quite a different approach: switching from being a single voice in an ensemble to constructing all the voices of a live performance. Setting up instrument layers in the traditional multi-track environment simply doesn’t accomplish what I’m trying to do. The challenge has been to make my hardware-software setup as flexible as possible while maintaining a sense of simplicity in the controls. I’m of the opinion that trying to manage an unwieldy number of parameters during a live performance can take away from what one might create musically with less. That being said, I love complexity of layers moving in and out of the listening field, enticing our ears with different shapes and colors. This is one of the reasons I enjoy playing with MiLO (the Milwaukee Laptop Orchestra) so much.
So begins the balancing act. The stochastic-styled Max patches I use (randomization built around user-controlled parameters) are certainly helpful in this regard. They allow certain layers to essentially run themselves while user input shapes their outcome. Most of the time I’ve spent with Max patches has been for the purpose of constructing meaningful randomization. The sample below is built around my use of the C8 Sequencer (the Max For Live patch I posted earlier) for creating rhythmic material. A few self-made loops (a bass ostinato, a pad, and aux percussion that uses C8) round out the sample.
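The “layers that run themselves while user input shapes the outcome” idea can be sketched in a few lines of Python. Again, this is only an illustration of the concept, not a transcription of any of my patches; the class name, scale, and density values are hypothetical. The layer picks its own notes at random, but only from a scale the performer supplies, and only as often as a single density parameter allows, so one knob shapes the whole texture:

```python
import random

class StochasticLayer:
    """A self-running layer: randomness constrained by a few user parameters."""

    def __init__(self, scale, density=0.5, seed=None):
        self.scale = scale        # allowed pitches (MIDI note numbers)
        self.density = density    # probability that a tick produces a note
        self.rng = random.Random(seed)

    def tick(self):
        """Return a pitch for this tick, or None for a rest."""
        if self.rng.random() < self.density:
            return self.rng.choice(self.scale)
        return None

# A fairly busy layer confined to five pitches.
layer = StochasticLayer(scale=[60, 62, 65, 67, 70], density=0.7, seed=42)
notes = [layer.tick() for _ in range(16)]
```

Nudging `density` toward 0 lets the layer dissolve into silence and toward 1 makes it relentless, which is the kind of meaningful randomization I’m after: the machine decides the details, the performer decides the shape.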
The other little gem here is the newest addition to my sonic arsenal: the illustrious Electro-Harmonix Bi-Filter. The unit was designed by Mike Beigel of Beigel Sound Lab, the designer of the Mu-tron III (Stevie Wonder’s “Higher Ground,” anyone?). This little rackmount unit has already proven to be tons-o-fun. I played around with the routing options inside Ableton Live, and I think I’m getting closer to what I would use live. The options for setting a single track’s audio output to Sends Only, an External channel, or the Master track let me use the Bi-Filter in different ways for different voices. Though I would have loved to see separate inputs for each filter section as an option, the routing inside Live allowed for a variety of uses from an FX unit with only one audio input.
The sample below is a real-time performance: no overdubs or track editing. Only light compression, normalization, and fades were applied after the performance.