Thursday, June 20, 2013

Wizardry

So now it is official.

I have several projects I've promised to work on, with a deadline already looming.

1.  Put LEDs on a hat.

2.  Put a strip light on voice control.

3.  Create a gestural interface.

4.  Keep working on my POV wand if I get time.


The first is a no-brainer.  My "Cree" 3W RGBs arrived in the mail, as did several meters of 6mm light pipe.  I could do this with some 1-watt resistors (I've learned THAT lesson!), but I'm holding out for the constant-current drivers I also put on mail order.  I don't have any strips, though, or really many discrete LEDs at all.  I was pricing some nice 50 mA greens in Piranha-style cases at Digi-Key, but I think there's enough time to wait and see what the hat looks like before I have to make another shopping trip.
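(For the record, the lesson in question is just Ohm's law.  Assuming a green die with roughly a 3.4 V forward drop, driven at 700 mA from a 5 V rail -- check the actual datasheet -- the ballast resistor works out to (5.0 - 3.4) / 0.7 ≈ 2.3 Ω, which dissipates 0.7² × 2.3 ≈ 1.1 W.  A quarter-watt part on a 3-watt emitter just releases the magic smoke, and even a 1-watt resistor is marginal; hence holding out for the drivers.)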

This might simply be hard-wired; the most complicated part could turn out to be the power button.  But...I am fully prepared to go RGB with it, and do some chases or something.
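If I do go RGB, the firmware is the easy half.  A minimal sketch of the kind of chase I mean -- the pins are placeholders, and it assumes constant-current drivers that take a PWM signal on their dim inputs:

```cpp
// Hypothetical hat sketch: chase the red, green, and blue dies
// in turn.  Pins and timing are placeholders, and it assumes the
// constant-current drivers accept a PWM signal on their dim inputs.
const int PINS[3] = {9, 10, 11};   // R, G, B driver inputs

void setup() {
  for (int i = 0; i < 3; i++) pinMode(PINS[i], OUTPUT);
}

void loop() {
  for (int c = 0; c < 3; c++) {
    for (int v = 0; v <= 255; v += 5) {    // fade this color up...
      analogWrite(PINS[c], v);
      delay(10);
    }
    for (int v = 255; v >= 0; v -= 5) {    // ...and back down
      analogWrite(PINS[c], v);
      delay(10);
    }
  }
}
```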

At the moment I am lacking most of the "DuckNode" system I've been trying to design over the past few years.  Which is to say, I don't currently have a plug-and-play solution for sending wireless control out to the stage and into a costume for an effect like this.

(The local Orchard Supply Hardware just got some vintage-style oil lamps in stock.  I'm pretty strongly tempted to stick a Cree, an ATtiny implementation of my custom "blink" code, and an XBee for hand-rolled remote control into one of them.  Except the oil tank is small enough that I'd probably want to use a LiPo for power, with a USB jack and a charging circuit.  And that makes this an $80 lamp....)
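The "blink" firmware, at least, is the cheap part of that lamp.  A sketch of the random-walk flicker I have in mind -- the pin and the constants are guesses at what might read as "oil flame," not a tested recipe:

```cpp
// Hypothetical oil-flame flicker: random-walk the brightness around
// a warm midpoint.  Runs on anything with one PWM pin, ATtiny via
// the Arduino core included.  All constants are untested guesses.
const int LAMP_PIN = 0;        // PB0 on an ATtiny85; pick your own
int level = 180;               // current brightness

void setup() {
  pinMode(LAMP_PIN, OUTPUT);
}

void loop() {
  level += random(-25, 26);              // small random step
  level = constrain(level, 80, 255);     // never fully dark, never off-scale
  analogWrite(LAMP_PIN, level);
  delay(random(20, 80));                 // irregular update rate
}
```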



2.  I picked up an 8-channel EQ display chip, which should do just fine for voice control.  Except it spits out its analog data serially, which means you pretty much want a micro to drive it, and I'm starting to run out of free AVRs (Arduino or not).  I may just have to solder up another protoboard with an ATtiny on it and build another pretend-it-is-an-Arduino clone, because I just don't have time to think my way through straight C and the AVR toolchain.
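The reading loop itself is short, at least.  A sketch assuming the chip behaves like the familiar MSGEQ7 (reset, then strobe once per band and read the multiplexed analog output), stretched to eight bands -- the pins, the settling delay, and the threshold are placeholders to check against the real datasheet:

```cpp
// Hypothetical EQ-chip reader, MSGEQ7-style strobe/reset interface
// stretched to eight bands.  Pins, timing, and threshold are all
// placeholders to verify against the actual datasheet.
const int RESET_PIN  = 7;
const int STROBE_PIN = 8;
const int LIGHT_PIN  = 13;     // stand-in for the relay output
const int BANDS      = 8;
int level[BANDS];

void setup() {
  pinMode(RESET_PIN, OUTPUT);
  pinMode(STROBE_PIN, OUTPUT);
  pinMode(LIGHT_PIN, OUTPUT);
}

void loop() {
  digitalWrite(RESET_PIN, HIGH);      // rewind the band multiplexer
  digitalWrite(RESET_PIN, LOW);
  for (int b = 0; b < BANDS; b++) {
    digitalWrite(STROBE_PIN, LOW);    // advance to the next band
    delayMicroseconds(40);            // let the analog output settle
    level[b] = analogRead(A0);
    digitalWrite(STROBE_PIN, HIGH);
    delayMicroseconds(40);
  }
  // Voice control, crudely: threshold the mid (speech) bands.
  bool talking = (level[2] + level[3]) > 600;   // made-up threshold
  digitalWrite(LIGHT_PIN, talking ? HIGH : LOW);
}
```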

The tough part is controlling anything more than a strip of LEDs.  I've been reading up on triac dimming, and I'm really not wanting to do hasty work around raw AC.  So the best options at this point for controlling lights at line voltage are:

     a.  PowerSwitch Tail.  This is a plug-and-play relay box good for 15 amps.  At a 10ms cycle time it isn't going to do PWM, but I'm willing to bet that a well-timed pure off-and-on is going to be close enough for a Dalek Eyestalk effect.

     b.  Cheap dimmers.  American DJ-type baby dimmer packs run as cheap as ninety bucks on Amazon.  Only four channels at a whopping 5 amps each, but it is a solid metal case and more-or-less UL listed.  The one I'm looking at will also take a 0-10V analog control signal, which is easy enough to fake up with an external power supply and some TIP120s.

     c.  Learn to talk DMX.  It has been done on the Arduino, and there are even libraries (minimal sketch below).  It isn't trivial -- in several ways it is messier than MIDI.  Of course, some packs you can talk to with MIDI as well, but spitting a LOT of MSC (MIDI Show Control) at an ETC light board is a good way to crash the board and stop the show.  So, yeah...I'll pass on the odd MSC for a single effect, but trying to do a PWM-type effect that way would be even more stupid than wiring up my own TRIAC-based danger shield.
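For the record on those libraries: DmxSimple is the one that keeps coming up.  A hedged proof-of-concept -- you still need an RS-485 transceiver between the AVR and the DMX line, and the output pin and channel number here are assumptions:

```cpp
// Hypothetical DMX test using the DmxSimple library: ramp one
// dimmer channel up and down.  Assumes an RS-485 driver (MAX485 or
// similar) on pin 3; channel 1 is wherever the pack is addressed.
#include <DmxSimple.h>

void setup() {
  DmxSimple.usePin(3);       // TX pin into the RS-485 transceiver
  DmxSimple.maxChannel(4);   // only sending the four pack channels
}

void loop() {
  for (int v = 0; v <= 255; v++) {
    DmxSimple.write(1, v);   // channel 1 brightness, up...
    delay(10);
  }
  for (int v = 255; v >= 0; v--) {
    DmxSimple.write(1, v);   // ...and back down
    delay(10);
  }
}
```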


3.  And this one gets interesting.

I've already shown I can detect a punching motion (which is a possible "Tim the Enchanter" magic gesture).  Not sure exactly how best to detect a snap of the fingers, if that's the way the actor wants to go.

Hold on.

Ouch ouch ouch.

Yes...I can detect the jarring motion of a snap of the fingers with an accelerometer mounted in a wrist band.  Or attached to a wrist with masking tape.  Ouch ouch ouch.
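The detection, for what it's worth, is nothing fancier than watching for a fast change.  A sketch assuming one axis of an ADXL335-class analog accelerometer on A0 -- the threshold and lockout numbers are pulled out of the air and would need tuning on the actual (sore) wrist:

```cpp
// Hypothetical snap detector: flag a sharp jolt on one accelerometer
// axis.  Threshold and lockout values are guesses, not measurements.
const int AXIS_PIN = A0;
const int THRESHOLD = 120;           // minimum sample-to-sample jump
const unsigned long LOCKOUT = 300;   // ms to ignore after a trigger

int lastReading = 0;
unsigned long lastTrigger = 0;

void setup() {
  Serial.begin(9600);
  lastReading = analogRead(AXIS_PIN);
}

void loop() {
  int reading = analogRead(AXIS_PIN);
  if (abs(reading - lastReading) > THRESHOLD &&
      millis() - lastTrigger > LOCKOUT) {
    lastTrigger = millis();
    Serial.println("SNAP");          // or: key the XBee here
  }
  lastReading = reading;
  delay(2);
}
```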

The tougher part would be discriminating -- better, specifying -- which of several DIFFERENT effects is to be triggered by pointing at it whilst gesturing.

The simplest solution is a compass.  Or, rather, a tilt-compensated triple-axis magnetometer.  Those aren't that expensive, but they only talk I2C -- which means a bare XBee can't talk to them, either.  I'd need an AVR to read the magnetometer chip and then tell the XBee to transmit the result to the interpreter.
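Something like the following, assuming an HMC5883L-class magnetometer on the I2C bus -- no tilt compensation here, and the zone boundaries are made up; the XBee just forwards whatever falls out of the UART:

```cpp
// Hypothetical heading-to-zone reader for an HMC5883L-style
// triple-axis magnetometer.  No tilt compensation -- assumes a
// more-or-less level wrist.  Zone boundaries are placeholders.
#include <Wire.h>

const int MAG_ADDR = 0x1E;     // HMC5883L's I2C address

void setup() {
  Wire.begin();
  Serial.begin(9600);          // the XBee rides the UART
  Wire.beginTransmission(MAG_ADDR);
  Wire.write(0x02);            // mode register...
  Wire.write(0x00);            // ...continuous-measurement mode
  Wire.endTransmission();
}

void loop() {
  Wire.beginTransmission(MAG_ADDR);
  Wire.write(0x03);            // point at the first data register
  Wire.endTransmission();
  Wire.requestFrom(MAG_ADDR, 6);

  int16_t raw[3];              // registers arrive in X, Z, Y order
  for (int i = 0; i < 3; i++) {
    int hi = Wire.read();
    int lo = Wire.read();
    raw[i] = (hi << 8) | lo;
  }

  float heading = atan2((float)raw[2], (float)raw[0]) * 180.0 / PI;
  if (heading < 0) heading += 360.0;

  // Carve the stage into three made-up zones, one byte each.
  char zone = (heading < 120.0) ? 'A' : (heading < 240.0) ? 'B' : 'C';
  Serial.write(zone);
  delay(100);
}
```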

Which in any case brings me squarely back to the DuckNode, and the Processing host software I have only begun to write.  (Current status?  I can select available serial ports on the fly and recognize individual pin states.)

Fortunately, #3 is purely a stretch goal.  First I need to get that voice-activated light working on the breadboard.  Maybe I'll even dare some 120V relays -- just for a proof of concept, mind you.




Oh, and I realized something about the POV circuit.  The persistence-of-vision effect itself only lasts about 250 milliseconds.  But if a lighting effect is much brighter than the ambient light, you will also get afterimage.  Which can linger quite a bit longer.

In daylight, my POV circuit isn't particularly good.  In a darkened room, I clearly "see" a swath that crosses most of my body.

So it is still plausible.  For the next iteration I'm moving the LEDs closer together, finding some way to diffuse them a little (they are too much brighter on-axis right now), and increasing their number.  I'm also finding that arbitrary patterns read better than words.  But at some point I really need to write a Processing routine that will turn a bitmapped image into a binary string, because I'm getting pretty tired of counting rows by hand as I manually type in 1s and 0s.
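The conversion itself is the easy kind of program.  Until I write the Processing version, here is the same idea as a plain C++ stand-in: it reads an ASCII PBM (the "plain" format most image editors can export) and prints each column as a string of 1s and 0s.  The column-major output order is my assumption about how the wand wants its data:

```cpp
// Hypothetical converter: read a plain-text PBM (P1) image and print
// each column, top to bottom, as a string of 1s and 0s -- one line
// per column, matching a vertical LED stick swept sideways.
// (C++ stand-in for the planned Processing routine; assumes the
// file carries no '#' comment lines.)
#include <fstream>
#include <iostream>
#include <string>
#include <vector>

int main(int argc, char** argv) {
  if (argc < 2) {
    std::cerr << "usage: pov_convert image.pbm\n";
    return 1;
  }
  std::ifstream in(argv[1]);
  std::string magic;
  int w = 0, h = 0;
  in >> magic >> w >> h;
  if (magic != "P1" || w <= 0 || h <= 0) {
    std::cerr << "not a plain (P1) PBM\n";
    return 1;
  }
  std::vector<int> px(w * h);
  for (int i = 0; i < w * h; i++) in >> px[i];

  for (int x = 0; x < w; x++) {          // one line per column
    for (int y = 0; y < h; y++) std::cout << px[y * w + x];
    std::cout << "\n";
  }
  return 0;
}
```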
