delay() is to Wiring what GOTO is to BASIC.
Sure, for a three-line sketch that blinks an LED, there's nothing wrong with using delay() to control the timing.
But for anything that involves human interaction or sensor data, every moment spent in delay() is a moment the device won't be able to respond to inputs.
Well, unless you are using interrupts, which is fine and dandy; but if you've mastered interrupts, use those same timers instead of adding delay() statements.
About half of the sketches I'm writing these days are in one way or another simulating analog electronics. The Morrow Project CBR Kit, for instance, had multiple places where it would blink a light two or three times, or display one word on the display for a second or two then switch to another. Basically trivial in analog electronics (which is what it was simulating), but challenging in a single-threaded micro environment.
The trick is to make something like a real-time operating system. Instead of spending 9/10ths of the clock cycles sitting in a wait loop, you emulate a counter in code. The code runs continuously, passing through the sensor and button checks and looking for new serial input and so forth (and running such things as software PWM), and when the counter reaches certain preset values, actions are taken.
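In Arduino terms, a minimal sketch of that structure might look like the following (the pin number and the 10 ms tick are placeholders, not from any real project):

  const int ledPin = 13;               // placeholder pin
  unsigned long lastTick = 0;
  unsigned int counter = 0;

  void setup() {
    pinMode(ledPin, OUTPUT);
  }

  void loop() {
    // Non-blocking work happens on every pass:
    // check buttons, read sensors, poll serial, run software PWM...

    if (millis() - lastTick >= 10) {   // emulate a 10 ms tick counter
      lastTick += 10;                  // rollover-safe thanks to unsigned math
      counter++;
      // compare counter against preset values and act here
    }
  }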
For a "flash three times," then, my pseudo-code might be something like:
counter++;
if (counter % 2 && counter < 6)
{LED on;}
else
{LED off;}
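In real code, dropped into the tick handler of the skeleton above, that might become something like (same placeholder pin and tick):

  if (counter % 2 && counter < 6)    // on during ticks 1, 3, and 5
    digitalWrite(ledPin, HIGH);
  else
    digitalWrite(ledPin, LOW);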
Or you can control the timing more closely with a more long-winded, harder-to-maintain bit of code:
if (counter == 100 || counter == 200 || counter == 300)
{LED on;}
if (counter == 120 || counter == 220 || counter == 320)
{LED off;}
The latter produces a short pulse for each blink: the LED is on for only 20 counts out of every 100.
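One way to tame the long-windedness, sketched here with the same invented times, is to move the event times into tables and loop over them:

  const unsigned int onAt[]  = {100, 200, 300};
  const unsigned int offAt[] = {120, 220, 320};

  for (byte i = 0; i < 3; i++) {
    if (counter == onAt[i])  digitalWrite(ledPin, HIGH);
    if (counter == offAt[i]) digitalWrite(ledPin, LOW);
  }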
Of course you can always add a second counter just to control the number of flashes:
counter++;
if (counter % 100 == 0 && flashes < 6)
{toggle LED; flashes++;}
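In runnable form (my own sketch of the idea, with flashes declared alongside counter; note it takes six toggles to get three on/off flashes):

  if (counter % 100 == 0 && flashes < 6) {
    digitalWrite(ledPin, !digitalRead(ledPin));   // toggle the LED
    flashes++;                                    // count toggles, not loop passes
  }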
These sorts of strategies allow you to have multiple events which appear free-running, each with its own timing: indicators blinking, pulsing, doing patterns, all at the same time.
I will add: I'm using delayMicroseconds() a bit. The reason is, I've been writing for the ATtinys, doing things like software PWM where the timing depends on the actual execution of the opcodes. So there are a few delayMicroseconds() statements in there to balance different sections of code, so the machine spends a roughly equivalent time on both ends of the PWM loop.
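A stripped-down illustration of that balancing idea (all numbers invented for the example; keep duty between 1 and 254 so neither delay is zero):

  void softPwmCycle(byte pin, byte duty) {
    digitalWrite(pin, HIGH);
    delayMicroseconds(4 * duty);            // on-time
    digitalWrite(pin, LOW);
    delayMicroseconds(4 * (255 - duty));    // off-time pads out the period
    // Together the two delays always total ~1020 us, so the PWM period
    // stays constant no matter where the duty cycle sits.
  }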
It is also often very useful to delay just a wee bit after a serial data receive, and for some kinds of sensor data. The point is: these delays are happening at machine scale, taking on the order of tens or hundreds of clock cycles -- little more than an average bit of actual code. The delays I was talking about in my introduction are vastly longer, up into multiple seconds.