Sunday, February 17, 2013

Tools

I'm between shows.  No design deadlines, and no props with deadlines either, now that the Maverick has shipped.  So this might be a good time to build some tools.


I've been having great fun with my wireless EasyButton.  Used it to run the lights on one (simple) show.  Used it a couple of times to check sound cues as I walked the house.  It has only been in one show so far -- or, rather, the guts were, transplanted into a television remote and used by the actor/narrator to "control" the action of the play.  (For that show, the XBee triggered a sound effect in a small Processing app running on a laptop, which was plugged into the sound board).
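For the curious, the host side of a rig like that is only a few lines of Processing.  Here is a minimal sketch along those same lines, using the Minim library for playback; the port index, baud rate, trigger byte, and sound file name are all stand-ins for whatever the actual hardware and show need.

    import processing.serial.*;
    import ddf.minim.*;          // Minim handles the sample playback

    Serial xbee;
    Minim minim;
    AudioSample effect;

    void setup() {
      xbee = new Serial(this, Serial.list()[0], 9600);  // XBee USB adapter; adjust port and baud
      minim = new Minim(this);
      effect = minim.loadSample("buzzer.wav");          // cue sound in the sketch's data folder
    }

    void draw() {
      while (xbee.available() > 0) {
        if (xbee.read() == '!') {   // single trigger byte sent by the button's XBee
          effect.trigger();         // play the effect out the laptop's audio, into the board
        }
      }
    }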


I've had even more fun with my simple MIDI button (this spits out MIDI over the standard 5-pin DIN connector when a switch contact is closed).  It has been in an orchestra pit twice to allow the percussionist to trigger a sound effect.  Another time it was wired into an intercom box to play a buzzer sound.  The flexibility here is that the trigger is just a button or sensor; the resulting command is interpreted, usually by QLab, meaning we have total control over what kind of sound plays, where it is placed, and so on.


I've been trying to dream up a new device along these same lines -- a plug-and-play solution for any number of sound/lighting/effects coordination problems.  Something I could have in my bag, that would hook up quickly with minimum wiring to a variety of inputs/sensors; something we can use to explore the kind of dynamic cueing new technologies have made possible.




In reality, of course, a sound operator or Stage Manager with a sharp eye and their finger on a button is good enough to time most effects.  One of the major places this fails is gunshots; no one can push a button close enough to the exact moment the actor squeezes the trigger for it to look real.

But... there's a perceptual barrier in action here.  We tend to think of the potential of sound and lighting and practical effects during the action of a play in terms of, well, the kinds of sounds and lighting and practical effects we've seen before.  So we tend to approach it in terms of "Here's something we usually do by having the light board operator press a Go button.  Would this be better if it were automated?"

Stated that way, the answer is usually "no."  Because automation needs to be wrung out and retains a non-zero chance of failure.  And because non-automation means there is a human in the loop and a chance for human decision in case things go wrong.  And because, if you are talking about a discrete effect that can be done with a button, hitting a button is usually close enough for the desired timing.

You have to think outside that theater box and think instead of an interactive environment.  A situation or place or prop with behavior that doesn't break easily into discrete "cues."

A simple case in point is a doorbell.  Or the intercom buzzer I spoke of above.  But then, in theatrical thinking this is not an "interactive automated non-cue."  Instead it is a "practical prop."  Which is to say: like the starter pistol used for many on-stage gunshots, it is an effect that is operated by the actor.  Or, rather, the "effect" is the result of the actor's interaction with a functional prop.

It is actually a significant paradigm break to realize that such a "practical" effect can be played through the sound system, the same way as other sound cues.  The tendency is to think that a doorbell is a doorbell, so you mount it on the door.

From a designer's standpoint, if it is part of the sonic environment of the play, it is part of the sonic environment of the play.  I want the same control over placement (through various speakers), volume, and of course the make-up of the actual sound.  So we de-couple the "practical" nature of the doorbell: the actor still presses a button, but what follows becomes an event quite similar to an ordinary sound cue, inside the system that is playing back the rest of the soundscape.


Let's break the paradigm completely.  Imagine a candle.

In the real world -- and, better yet, in the world I want to create as a lighting designer -- a candle in the hand will flutter and gutter when moved.  When it is set down, it stabilizes (unless there is a mysterious wind...!)  Also, of course, the light it casts pools around it.  More subtly, our eyes adjust to the candle; if it is dark when it is lit, the first flare will be uncomfortably bright, then our eyes adjust.  And if the candle's first moment comes as the rest of the lights are doused, it will seem dim at first, but then we adjust until it is a warm, comfortable glow.

Not all of this is necessarily part of the world I want as a lighting designer.  That depends on the context, the total lighting environment, the mood and pacing of the play, the realism of the production.  A candle for "A Streetcar Named Desire" will be different from a candle for "Dracula."

As a designer, some of what I might want to achieve can be achieved with standard lighting tools.  Some is best done that way (after all, if the flare of the candle is going to reveal Dracula, it really should happen after the actor is in place and ready!)

What the electronics gives us are additional tools to achieve the desired effect.

If the flickering is important to me, well, the most efficient solution is a motion-sensitive candle.  The prop itself flickers when it moves.  If what is important to me is that a lighting cue follow instantly when the candle is lit, then it is best if the candle and the light board can talk to each other.  Either the candle tells the light board to bring the lights up, or the light board controls the candle so the timing is correct.

And a tricky thing here...because we are bringing DATA into the light board, the interactivity isn't constrained to "take a cue."  Instead we could take direct control over the level of a warmer.  We could even control the position of a moving light, moment to moment.



In any case.  These thoughts aren't new.  The position I find myself in, however, is that knowing these kinds of solutions are possible is not a help when:

A) The potential for this kind of interaction hasn't occurred to anyone else, thus the action -- even the script -- has already been modified by the time you get there, so it is no longer necessary to do so.

B) The development cycle of tech is too short to allow construction of new interactive devices from scratch.

What I see as the solution is some kind or kinds of pre-made multi-purpose devices.  Like my EasyButton, these would be "close enough" that they could be modified during tech and dropped into the production.

Which is what happened on "Click Clack Moo."  I brought the EasyButton out for a rehearsal, got a go-ahead to do the practical prop, and moved the circuitry into the remote in time for tech.  The only stumbling block (and it was a huge one) was having to use a version of QLab that didn't have the MIDI functionality license.  So I had to change the software layer on the fly.

The more that these solutions prove themselves, the more it will get into the heads of directors and producers and the rest of the design team that such things are possible.

But the first step remains: building the black boxes themselves.



The problem is, of course, trying to make something general-purpose enough that it can be adapted to the needs of the moment, but "built up" enough that it doesn't take weeks to adapt it!


In a way a better test case for demonstrating interactivity is the gun problem.  I have a new version of that.  Way back on "Tis Pity She's a Whore" the actor had a plastic gun and a radio transmitter hidden in his other hand.  He coordinated the action himself.  That was an older version of my MIDI box -- one that picked up the short-range signal from a keyfob transmitter.  It was the extremely short range of the radio that made me upgrade to XBee chips instead, but I do miss the handy keyfob.


My newest idea on this is an all-purpose "trigger finger."  Instead of modifying the prop gun to put a sensor on it, put a sensor on the actor.  My thought at the moment is basically a one-finger Data Glove: one flex sensor in a finger cot, then enough software tweaking to detect a trigger pull as a distinct event.

Then that is linked via XBee to a base station which turns it into, eventually, sound and possibly other events.

My current thinking and experience is that, at least during development, the XBee should pass on continuous sensor values (this is less battery-efficient but still good enough for theater).  Then the receiving XBee node is plugged directly into the USB port of a host computer.  This allows me to trim the sensor parameters either in software or via a GUI, meaning a third-party user could set the thing up.  Or set up a "learning" loop, for that matter.
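As a rough illustration, here is a minimal Processing sketch of the sort of host-side event detection I have in mind.  It assumes the remote XBee streams the flex sensor as plain ASCII numbers (0-1023), one reading per line, arriving through the receiving XBee's USB serial port; the threshold values are placeholders that would get trimmed during tech (or hung on GUI sliders).

    import processing.serial.*;

    Serial xbee;                 // receiving XBee node on the host's USB port
    int pullThreshold = 600;     // flex reading that counts as "trigger pulled" -- trim in tech
    int releaseLevel = 450;      // must relax below this before re-arming
    boolean ready = true;

    void setup() {
      xbee = new Serial(this, Serial.list()[0], 9600);  // adjust port index and baud as needed
      xbee.bufferUntil('\n');    // fire serialEvent() once per complete reading
    }

    void draw() { }              // all the work happens in serialEvent()

    void serialEvent(Serial p) {
      String line = p.readStringUntil('\n');
      if (line == null) return;
      int value = int(trim(line));          // one flex-sensor reading, 0-1023

      if (ready && value > pullThreshold) {
        ready = false;                      // one event per pull, no re-triggering
        triggerPull();
      } else if (!ready && value < releaseLevel) {
        ready = true;                       // finger relaxed; armed for the next pull
      }
    }

    void triggerPull() {
      println("Trigger pull detected");
      // here: trigger a sample, send MIDI to QLab, flash a cue light, etc.
    }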

The Processing or Pure Data app could handle a sound file natively, or spit out software MIDI to then be picked up by QLab or a software sampler or another app.  Plus it could also send commands via the XBee network or a serial connection to other hardware.

Of course the theater I'm currently at is Spielbergian.  It is unlikely they'll have a gun in any production soon.

However.  Most of the principles could work the same way for, say, a magic wand.  Way back on "Oliver!" I really wanted the Beadle to be able to make a huge sound by stamping his staff of office on the ground.  Given an accelerometer, such a thing would be quite possible today.

The trick is working out a form factor that is sufficiently flexible.

The idea of the "trigger finger" is that the bulk of the electronics is carried by the actor.  Nothing is attached to the prop.  The actor can walk around, pick up the prop, do the moment, then put the prop away again -- all without hassling with wires and connectors and switches.  They would simply wear this effect just as they would wear a wireless microphone.

The only real trick would be a software interlock: the host application would be sent a cue to "unlock" it, or take it off "safe," just before the gunshot moment.  Then the software would be inhibited again following the moment, to prevent misfires.
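Sketched out, the interlock is almost trivial.  In this stand-alone Processing sketch the arm/safe toggle comes from the keyboard and a keypress stands in for the sensor event; in practice both would arrive as messages from the cueing system and the XBee, respectively.

    boolean safetyOn = true;

    void setup() {
      size(200, 200);
    }

    void draw() {
      background(safetyOn ? 80 : 200, 0, 0);  // dull when on safe, bright red when armed
    }

    void keyPressed() {
      if (key == 'a') safetyOn = false;   // arm it just before the gunshot moment
      if (key == 's') safetyOn = true;    // back on safe right after
      if (key == 't') triggerPull();      // stand-in for the real sensor event
    }

    void triggerPull() {
      if (safetyOn) {
        println("Trigger pull ignored (on safe)");
        return;
      }
      println("BANG -- fire the gunshot effect here");
      safetyOn = true;                    // one shot per arming, to prevent misfires
    }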

It would seem to make sense to stuff the circuitry inside the Beadle's staff, but I am just now thinking that the smarter form factor might be the same thing: a wristband for the actor, worn under the clothing.  Sure, the actor would be more comfortable if the staff carried all the wires and batteries and all that.  But the advantage of a device that straps to the actor is that it is, again, generic.

Not that the Beadle is going to go around pounding on random objects.  But NEXT show, someone else might hammer down a beer mug.  Or flick a wand.  Or a sword.

Basically, I'm talking a Wii here.  A wireless accelerometer, and host software that interprets the movements to create actionable software events (MIDI note events, Max/MSP-type patches, and so on).
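As a sketch of what that host software might look like: assume the wristband streams its three accelerometer axes over the XBee as comma-separated values in g, and that something like the contributed MidiBus library carries the resulting note on to QLab or a sampler.  The spike threshold is a guess to be tuned in the room.

    import processing.serial.*;
    import themidibus.*;         // The MidiBus library, assumed here for MIDI out

    Serial xbee;
    MidiBus midi;
    float spikeThreshold = 2.5;  // total acceleration, in g, that counts as a "hit"
    boolean quiet = true;

    void setup() {
      xbee = new Serial(this, Serial.list()[0], 9600);
      xbee.bufferUntil('\n');
      midi = new MidiBus(this, -1, 0);   // no MIDI input, first available MIDI output
    }

    void draw() { }

    void serialEvent(Serial p) {
      String line = p.readStringUntil('\n');
      if (line == null) return;
      String[] parts = split(trim(line), ',');   // assumed wire format: "x,y,z" per line
      if (parts.length != 3) return;

      float g = mag(float(parts[0]), float(parts[1]), float(parts[2]));

      if (quiet && g > spikeThreshold) {
        quiet = false;
        midi.sendNoteOn(0, 60, 127);     // middle C on channel 1 -- map to the cue in QLab
      } else if (!quiet && g < 1.2) {
        quiet = true;                    // back near 1 g at rest; ready for the next hit
      }
    }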

And something like this might already exist.  I hope so because the requirements are several.

It should be very small, very light, with a battery life in the 4-6 hour range minimum.  It should provide the possibility of positive feedback.  It should permit real-time two-way monitoring (which makes system check a LOT easier).  It should be cased, able to withstand knocking about and being sweated on.  It really should use rechargeable batteries...most efficient would be LiPo, with included charging circuitry and jack.

Basically, the thing looks like a remote sensor.  Oh, yes -- and it should also include access to the XBee for field re-programming.

The main question is whether to try to modularize the sensors.  I suspect that although for testing purposes a glove would work, to permit it to be hidden properly it needs to be in a band that can be tucked up as high as the upper arm.  As long as it is on the forearm, the accelerometer will function; so that could be installed permanently.  But a finger sensor should be plugged in...so add a jack or two for external, software-configurable sensors.

Given LiPo batteries, a power switch, and a two-way status display, this box would be just as functional off the actor: attached to a door frame for a doorbell, say.  So it seems like a very good form factor for a generic sensor device.

Oh...it strikes me that although it isn't as much fun as curling a finger, you could trigger a gunshot by jerking your arm back as if with the recoil.  So it seems smart at the moment to make the expandable box first, and worry about finger sensors and similar later.

V2 of this should really be a custom board -- include the LiPo charger, level shifter if necessary, FTDI (XBees can be programmed remotely but it is a pain), and accelerometer.  For V1, probably just put stuff together on proto board.  For the nonce, the sensor end will be "dumb"; a better version would add an ATtiny for sleep management and on-the-fly programming of the XBee for power management.

So maybe now the place to look is those jogger's pouches for iPods and see what can be adapted to hold a somewhat larger electronics pack...

