Saturday, March 2, 2013

Objective....Not Attained

I'm taking a break from Objective-C.  The "Hello World" tutorial I was following doesn't work.  Odd.  It is off the Apple site and should be compatible with the latest version of Xcode and Cocoa.  The tutorial included the final complete code and I copied that over to make an exact image -- and it still didn't work.

In the process of trying to figure out why, I learned a lot about working in Xcode, and about the language.  I think I grasp the concepts and syntax of Objective-C now, and am fairly confident in being able to write working code in it.  But the stumbling block is working within the Cocoa framework and in the Xcode development environment.

So I'm ducking back to Processing to move my DuckNode project forward.  I realized that I can model accelerometer data as a point in 3D space -- but it will require re-learning some of my trig.  If I can plot a combined vector and a single acceleration value, I can construct "target" values as a planar surface in 3D.  So the code would ask whether the measured acceleration is of sufficient magnitude to cross the plane, whether its vector points so as to intersect the plane, and whether it remains at or above the threshold values for a preset interval.  This should be sufficient to discriminate a small number of simple motions.
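
For my own reference, here's a rough Processing sketch of that test.  Everything in it -- the plane normal, the thresholds, the dwell time, the names -- is a placeholder of my own, not actual DuckNode code:

// Hypothetical version of the plane-crossing test described above.
// planeNormal, magThreshold, angleCosMin, and dwellMillis are all placeholders.
PVector planeNormal = new PVector(0, 0, 1);   // orientation of the "target" plane
float magThreshold  = 1.5;                    // minimum acceleration magnitude (g)
float angleCosMin   = cos(radians(30));       // vector must be within 30 degrees of the normal
int   dwellMillis   = 100;                    // how long the condition must hold
int   crossedSince  = -1;                     // -1 means "not currently above threshold"

// Call this once per sample with the measured acceleration vector.
boolean motionDetected(PVector accel) {
  float mag = accel.mag();
  // Cosine of the angle between the measured vector and the plane normal.
  float alignment = (mag > 0) ? PVector.dot(accel, planeNormal) / mag : 0;

  if (mag >= magThreshold && alignment >= angleCosMin) {
    if (crossedSince < 0) crossedSince = millis();      // start the dwell timer
    return (millis() - crossedSince) >= dwellMillis;    // held long enough?
  } else {
    crossedSince = -1;                                  // condition broken; reset
    return false;
  }
}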

The next software step, though (work incrementally, remember!) is to expand my current Quacker software to retrieve and display a real-time analog value from a DuckNode.
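
Something like this is what I have in mind for that step, assuming for the moment that the node (or a bridge sketch) sends one ASCII number per line; real XBee API frames would need actual frame parsing, and the port index and baud rate below are guesses:

import processing.serial.*;

// Rough sketch of the "read and display one analog value" step.
Serial port;
float reading = 0;

void setup() {
  size(400, 200);
  // Port index and baud rate are guesses -- adjust for the actual setup.
  port = new Serial(this, Serial.list()[0], 9600);
  port.bufferUntil('\n');
}

void serialEvent(Serial p) {
  String line = p.readStringUntil('\n');
  if (line != null) reading = float(trim(line));   // latest analog value
}

void draw() {
  background(0);
  // Map a 10-bit reading to a bar width and show the raw number.
  float w = map(reading, 0, 1023, 0, width);
  fill(0, 200, 255);
  rect(0, height/2 - 20, w, 40);
  fill(255);
  text(nf(reading, 0, 0), 10, 20);
}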

(Sure, I'm dreaming of more.  I want a piece of software that can display in real time the pin reads off the remote XBee, as well as any serial messages sent, AND pop up a standard AT mode terminal for on-the-fly reprogramming.  But that's all for development.  For the Quacker package, I want it to be third-party friendly and allow patching of multiple XBee receives to both MIDI events and direct software sound playback.)
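
As a toy version of the patching idea, the stock javax.sound.midi classes that ship with Java (and therefore are available to Processing) can fire a note when a sensor read crosses a threshold -- the channel, note, and velocity numbers below are placeholders, not anything Quacker actually does yet:

import javax.sound.midi.*;

// Toy version of "patch a remote read to a MIDI event."
Receiver midiOut;

void setup() {
  try {
    midiOut = MidiSystem.getReceiver();   // default system MIDI output
  } catch (MidiUnavailableException e) {
    println("No MIDI output available: " + e);
  }
}

// Call when a sensor value crosses its threshold.
void fireCue(int note) {
  if (midiOut == null) return;
  try {
    ShortMessage on = new ShortMessage();
    on.setMessage(ShortMessage.NOTE_ON, 0, note, 100);  // channel 0, velocity 100
    midiOut.send(on, -1);                                // -1 = send immediately
  } catch (InvalidMidiDataException e) {
    println("Bad MIDI message: " + e);
  }
}

void draw() { }   // nothing to draw in this fragment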

(I'm still toying with LCARS, of course.  Even though that is a potential huge waste of time -- the interface doesn't need to be pretty, it only needs to work.  The LCARS methods are actually a pretty good match for Processing -- Processing loves doing color-changing blocks and lines of text -- but it isn't necessarily a great interface for connecting a remote sensor to options in a sound or lighting cue playback environment.)

I've also re-thought part of the DuckNode itself.  LiPo might still be necessary for power density -- I need to calculate the drain of not just constant radio activity but of everything on the node.  (The XBees are designed for low-power use, but they achieve that by lowering data rates.  Unless I add an interactive layer for on-the-fly reprogramming of the XBee nodes, I'm stuck running three or more analog channels at a high data rate continuously for as long as the node is active.)  Oh, and the accelerometer is also a surprisingly high power drain.
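
Just to get a feel for the numbers, a back-of-the-envelope runtime calculation looks like this -- every current and capacity figure below is a guess to be replaced with measured or datasheet values:

// Back-of-the-envelope battery math for the node; all numbers are placeholders.
float radioCurrent = 45;    // mA, XBee transmitting more or less continuously (guess)
float accelCurrent = 0.5;   // mA, accelerometer (guess)
float mcuCurrent   = 8;     // mA, microcontroller and regulator overhead (guess)
float capacity     = 800;   // mAh, a small LiPo or a pair of AAs derated for load

void setup() {
  float totalDraw = radioCurrent + accelCurrent + mcuCurrent;   // mA
  float hours     = capacity / totalDraw;                       // mAh / mA = hours
  println("Total draw: " + nf(totalDraw, 0, 1) + " mA");
  println("Estimated runtime: " + nf(hours, 0, 1) + " hours");
}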

In any case, although a "permanently installed" LiPo and a charging jack make sense for much remote sensing, for theater use it makes much more sense to design around removable batteries.  You don't want to be dependent on the LiPo charging cycle to have a safe margin going into the second show of a two-show day, or the tenth hour of a twelve-hour tech day.  You want to be able to put in fresh, trusted batteries at the top of the show.  And, for that matter, monitor them remotely.  Besides, what with wireless microphones, theaters have an existing infrastructure of AA batteries, often rechargeables, and the ability to throw in a fresh pack if there is any doubt at all about the battery status of a show-critical bit of technology.

Mostly, though, I've been working a tough load-in at my local facility.  I'm sore and covered in splinters, and I haven't had the time or energy to do much more than glance at my programming books or scribble a few drawings of interface needs on a scrap of graph paper.
