Tuesday, August 6, 2013

Non-Linear Playback; The Evolving Story

Seemingly all the sound effect designs I do these days have some non-linear elements.  But at the same time, the majority of any show is still run effect by effect, often called and/or executed by the Stage Manager, and presented in one long list.



"The Wiz" (which is running now) is one of these standards.  My Sound Assistant is on headset, taking each cue from the Stage Manager.  I've built in dips (fade cues and restore cues) to take down the sound effects during songs, and these are also being called as lettered cues by the Stage Manager.

The one exception on this show is also the first time I've had someone other than myself executing an improvised effect.  To wit: the wind effect for the twister is meant to sound artificial, more like a performance on a synthesizer patch.  Which it is; my Sound Assistant has a tiny Ozone keyboard beside him, and he performs the wind effect each night.
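The idea of a "performable" wind is simple enough to sketch in code.  This is not the patch from the show, just a minimal stand-in: filtered white noise where key velocity opens the filter, so a harder strike gives a brighter, gustier sound.

```python
import math
import random

def wind_burst(velocity, duration_s=1.0, rate=44100):
    """Render a wind-like noise burst; key velocity (0-127) sets brightness.

    One-pole low-pass over white noise: a soft key press gives a dark,
    distant wind, a hard press a bright gust. (Illustrative stand-in
    for a synth patch, not the actual patch used in the show.)
    """
    # Map velocity to a filter cutoff: low velocity -> dark, smooth wind.
    cutoff_hz = 100 + (velocity / 127.0) * 2000
    alpha = 1 - math.exp(-2 * math.pi * cutoff_hz / rate)
    out, y = [], 0.0
    for _ in range(int(duration_s * rate)):
        y += alpha * (random.uniform(-1, 1) - y)  # one-pole low-pass step
        out.append(y)
    return out

samples = wind_burst(velocity=96, duration_s=0.25)
```

In practice the same mapping would live inside the synth patch itself; the point is that the operator's touch, not a pre-rendered file, shapes the sound.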



This last weekend I opened and closed "Starmites."  Four-performance run.  In that case, once again, the majority of the cues were presented in linear order; they were programmed into QLab, run off a laptop, and the Stage Manager herself was pressing the "Go" button.

I had a second laptop at the FOH mixing position.  There, I had copies of several of the sounds, several background loops, a foley-type effect and an electric guitar patch.

Taking these in order: the tech was abbreviated and the cast sometimes uncertain of their actions.  So there were a couple of sound effects I had duplicated so that if the kids jumped a scene, or we'd messed up and forgotten to put in a sound, I could fire it from my keyboard instead.  The nature of these sounds (mostly magical attacks) was such that it wouldn't be a big problem if we accidentally both played the sound at the same time.

Which in fact did happen -- but as always I had thrown master faders for the sound effects onto the top layer of the mixing desk, so it was an easy matter to fade out the duplicate manually.

The background ambiance cues were on my computer because of the harried tech.  Even though I had the sounds built, it was simply too much to add them to the Stage Manager's book during what were already difficult technical moments.  These were low-level looping background effects anyhow so it was fine to just add them in to taste from the keyboard with one hand whilst I mixed the show with the other.

The guitar was there because the MacGuffin of the show, "The Cruelty," is basically an evil electric guitar.  We were hoping the band would do some guitar stuff as the prop was revealed, but that never quite happened...so I did some random fumbling live on a nice crunchy patch with a lot of echo (and a ton of bending).

The foley...the first time I remember doing something like this was for "Honk!" where there was a whole bit about a man with squeaky rubber boots.  So I threw boot squeaks onto two keys of my sampler and followed the actor as he walked around.
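The sampler side of that gag is just a key-to-sample map.  A minimal sketch, with hypothetical file names; two takes on adjacent keys keep repeated footsteps from sounding machine-gunned:

```python
# Hypothetical key-to-sample map: two squeak takes on adjacent keys,
# so alternating fingers alternates the takes.
SQUEAK_KEYS = {
    60: "boot_squeak_a.wav",  # MIDI note C4
    61: "boot_squeak_b.wav",  # MIDI note C#4
}

def on_key_down(note):
    """Return the sample to fire for a key press, or None to ignore it."""
    return SQUEAK_KEYS.get(note)
```

The performance skill is all in the timing: watching the actor's feet and landing each squeak on the step.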

This was a similar gag.



So far, however, the only actor-triggered effects I have had were the Duck's "Universal Remote Control" for "Click Clack Moo," and a pistol used in a production of "Tis a Pity..."   The latter used a 424 kHz radio link to trigger a QLab sound cue.  The former used a quick Processing sketch to interpret an XBee signal and play back a sound.

Oh, and an intercom buzzer for "Moonlight and Magnolias."  That was a strange compound cue; the practical switch on the prop intercom was detected by an Arduino that spit a MIDI message all the way upstairs to where Sound Cue was running on a PC.  Usually, the secretary was on "the other end" -- an actress backstage on a microphone that was fed into the same speaker.  But at one point she "connects" several other callers, which were pre-recorded voice-over sessions, and these were played back over Sound Cue as the Stage Manager "called" cues.
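The Arduino end of that chain is essentially a debounced edge detector that emits one raw MIDI Note On per press.  The actual sketch was Arduino C; this is a Python sketch of the same logic (pin reads are passed in so it can run off-hardware, and the note number is illustrative):

```python
def midi_note_on(note, velocity=127, channel=0):
    """Build a raw 3-byte MIDI Note On message (status, note, velocity)."""
    return bytes([0x90 | (channel & 0x0F), note & 0x7F, velocity & 0x7F])

class BuzzerSwitch:
    """Edge detector: fire exactly one MIDI message per switch press.

    Mirrors what an Arduino loop() would do -- compare this read to the
    last one, and only emit on the press edge, not while held.
    """
    def __init__(self, note=60):
        self.note = note
        self.was_pressed = False

    def poll(self, pressed):
        fire = pressed and not self.was_pressed
        self.was_pressed = pressed
        return midi_note_on(self.note) if fire else None
```

On the PC side, the playback software just maps that incoming note to the buzzer sound cue.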

I've since worked on that signal chain, so I now have a custom Processing app that reacts to various inputs from battery-powered XBee radios and spits out a MIDI signal that can be picked up by QLab or by a sampler or Max patch.  I've used it with a Staples "Easy" Button modified for an XBee wireless link, and with a basic accelerometer setup on a wrist band...but neither has yet been in a show.
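The heart of an app like that is a routing table from radio source to MIDI note.  The real app is Processing (Java); here's the core logic as a Python sketch, with hypothetical source IDs and note assignments:

```python
# Hypothetical routing table: each XBee source ID maps to the MIDI note
# that QLab (or a sampler, or a Max patch) is listening for.
TRIGGER_MAP = {
    0x01: 60,  # the modified "Easy" Button
    0x02: 62,  # the wrist-band accelerometer
}

def route(source_id, channel=0, velocity=127):
    """Translate a radio trigger into raw MIDI bytes, or None if unknown."""
    note = TRIGGER_MAP.get(source_id)
    if note is None:
        return None  # ignore chatter from unrecognized radios
    return bytes([0x90 | (channel & 0x0F), note & 0x7F, velocity & 0x7F])
```

Keeping the map in one table means re-pointing a prop at a different cue is a one-line change, not a rewire.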

Next on my programming chores is to add the ability to select a sound file for playback from within the Processing app, to allow skipping the MIDI step entirely for simple shows.  But at the moment, that is my state-of-the-art in non-linear sound.
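That planned MIDI-skipping mode could be as simple as mapping radio IDs straight to files.  A sketch of the idea, with hypothetical IDs and file names, shelling out to macOS's afplay as one stand-in player:

```python
import subprocess

# Hypothetical direct map for simple shows: radio source ID -> sound file,
# no MIDI hop in between.
SOUND_MAP = {
    0x01: "thunder.wav",
    0x02: "doorbell.wav",
}

def fire(source_id, player="afplay", dry_run=False):
    """Launch playback for a trigger; returns the command used, or None."""
    path = SOUND_MAP.get(source_id)
    if path is None:
        return None
    cmd = [player, path]
    if not dry_run:
        subprocess.Popen(cmd)  # fire-and-forget, like a one-shot cue
    return cmd
```

The trade-off versus the MIDI route is flexibility: QLab gives you fades, groups, and follows for free, where direct playback is strictly one-shot.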

