Wednesday, August 24, 2016

I am not left-handed

I'm taking a day or two off work to let my injured thumb heal. I needed it anyhow -- got a cold to get over as well, and I could use some uninterrupted time for prop work.

Amazing, though, how many things one uses a right thumb for. I've even tried to train my hands (I'm a self-taught touch-typist) to hit the spacebar with my left instead. But it is unusual enough to break my train of thought.

Anyhow.


I've three variant holocrons coming along. Still trying to problem-solve the lighting issues, though. My best thought is still to stick the circuit board at the bottom, likely within a diffusing cube, and live with the bottom being unevenly illuminated.

I've been working with the Eagle file and if I omit the user buttons I can get everything within a 2x2 inch footprint. I've also started a new board (it is easier than editing the old one) that places the USB and LiPo circuits on one board and the CPU and lighting circuit on a second, with standoffs and a connector between them. Not sure it is worth doing, though.

And I still go back and forth on an external USB jack. It does seem a little less magical to have files stored on it if you have to open it up and fumble with the circuit board containing the thumb drive every time you want to plug it in. Either way, though, I'm spoiling two faces in order to make that connection.

Oh, yeah. And Adafruit is currently back-ordered on the through-hole Neopixels I wanted to try in the next revision. I'd solder my own WS2811-integrated RGBs to the face of the board except for two things: that would commit to 180 degrees of lighting (dark spot below the circuit board), and according to the literature there is an alarmingly high failure rate in doing reflow on these.

And if 180 degree lighting is acceptable -- then why not use the Cree I've been working with before? I love the intensity. Main drawback (besides the smaller illumination angle) is I'd have to run the charge circuit outside of the USB spec.

That's the thing, sadly. On many projects, every solution is sub-optimal. Sometimes the choice is not for which is least sub-optimal, but for which is more satisfying from some extraneous aesthetic.

Sunday, August 21, 2016

Hol-oh-no

I stuck pictures up of my first production prototype and so far all the responses have been for a fully-assembled and wired Holocron.

This isn't what I got into the project for.

I wanted to make available a kit; a kit that had better aesthetics than the one kit I knew of, that assembled more easily (it isn't hard!), and that was if possible cheaper. The majority of work I've done on the mechanical design has been in aid of making it easier for other people to assemble one (if it was just me, I'd build a jig and slap glue over the pieces and there it would be).

Probably the thing to do is to finish up the samples of the alternate designs and see if I can entice more people into getting the kit instead. Or deal with it: assembly is really not that onerous (after all, I did do all that work to make it easy!), so figure out how I'd have to price assembled units to be worth it to me.




Oh, but the lighting circuit just isn't working. And there's worse. I just did an experiment. Faked up a box of translucent white 1/16" acrylic and stuck a 3W Cree in there. And it looks wonderful. I'm willing to deal with a dark spot on the bottom -- the holocron opens up there anyhow so the illusion suffers already at that angle -- so I can probably make this work with a circuit board lying on the bottom of a diffusor cube.

The cube shape is just barely glimpsed, but gives an incredible sense of depth and complexity... an almost tesseract-like effect, which can only be enhanced by detailing the cube with some black acrylic or vinyl decals.

The alternatives I have aren't wonderful. Assuming I can't come up with a simpler way to cover at least a 180-degree hemisphere, it's either through-hole Neopixels that can be bent outwards, or making my own mini circuit boards with right-angle headers.




Well, okay. I need to fix some issues with the files, laser off enough pieces to fix the magnet problem on the first production proto, and complete the Imperial, Temple, and perhaps a second Stolen (mostly because I already cut most of the shell parts for one). But I might not get to the laser this weekend; I also have to problem-solve this new idea of an inner cube, and how it gets attached.

Thursday, August 18, 2016

Holonought

The lighting circuit doesn't work.




Basically, the circuit board is just too damn big. Also, the current neopixels are a pain to work with.

So my current idea is to break up the board into two parts; the upper part being the CPU and LEDs, the lower part containing the USB host and Lithium Polymer battery charge circuit.


Also, there's something odd going on with the magnets. The lid seems to want to hover just slightly ajar. I think I need to move the magnets so they are attracting straight in instead of sideways. And I finally figured out how to do that. And it should even allow me to shrink the border slightly for a nicer (and more canonical) look.

I need to cut a new diffusion shell anyhow (to improve the snap-fit), so for my alternate "stolen" holo (which has narrow edges but is otherwise design compatible) I will do one without laser-engraved diffusion, and experiment to see if judicious sandpaper application can get more of the canonical look. Plus it would be a cheaper kit offering.

Meanwhile three other shells are in various stages of assembly and painting. Given the time to run off more diffusion pieces, I should end up with four holocrons lying around...


Sam I Am

I need another project -- or a story idea -- like I need a thing that is not needed. But try telling that to the Plot Bunnies.

Like the bunny that came this afternoon muttering, "No one leaves!" in archaic Japanese. I just can't seem to leave Yamatai alone. Well, here's the latest wild idea; leave Lara Croft home. So this becomes Samantha's story...and the rest of this meandering sketch-in-progress goes below the fold.


Monday, August 15, 2016

Physics of Sound : Addendum

aka "We're doomed, doomed." cf. "Kids these days..."


In my previous essay I emphasized how real-world physical acoustics leaves fingerprints in recorded sound. For instance: record in your living room, and unless you smother it with excessive post-processing, anyone listening will know it was recorded in a living room. Which is fine, unless you meant for it to sound like it was recorded on a wind-swept moor.

The corollary is that acoustic physics can be the easiest way to load desired information into a recorded sound. Want a cue to sound like it is coming from an iPod speaker? Play it back on an iPod speaker. Or play it back on that speaker, record the result, and play that back! (Leaving aside whether placement in space is also desired for that particular effect).

However.

Your audience is increasingly not getting that necessary reference to the real acoustic world. They are increasingly surrounded by processed sound. By amplified sound, by reinforced sound, by manipulated sound, and more than anything else by recorded sound.

This is the latest serve in the volley between audience and sound designer. The first could be said to have started back in the Mystery Plays. By the time of opera and vaudeville, a whole symbolic language had been built of artificial sounds standing in for elements of the desired environment; mechanical effects from the slapstick to the thunder run and the wind machine.

This is a trend developed through the golden age of the radio play and the early sound films, advanced by creative directors like Hitchcock and Welles, and reaching fruition sometime in the 70's when film sound became a fully designed element; no longer thought of in terms of mere reproduction, but a canvas of substitution. Film sound has become akin to film editing in being a language that must be learned by the audience, until they accept without thinking that the cry of a red-tailed hawk means the mountain on screen (whether it is meant to be in Peru or on Barsoom) is tall and majestic.

A Hollywood gunshot or fist no longer sounds much like any "real" gun or fist, to the point at which the sound designer takes a risk in putting out a sound that goes against that programmed expectation. The otherwise unmemorable action film Blown Away went through expensive effort to record the actual sounds of explosives before test screenings forced them back into the stock, expected, "blowing on a microphone" effect that was itself a relic of earlier and more primitive microphone techniques.

The next volley is amplified music on stage and ADR dialog on film; an audience raised to expect the kind of pristine vocals and instrument reproduction possible in a studio (or with studio techniques laboriously introduced into every available cranny of production audio and married as seamlessly as possible with studio re-takes). The audience of 1940 heard mostly unreinforced voices on stage, even in musicals and opera, from the pulpit and even from the podium and bandwagon. Now reinforcement is omnipresent, and the vast majority of story and song delivered to the theater audience arrives outside of that still-acoustic space.

In short, the audience is used to hearing every syllable, every finger pluck, clearly. They don't have to pay attention, much less strain, when listening at home to radio or television or recording, and they aren't listening to unassisted voices in an acoustical environment in the movie house or concert stage. With rare exceptions.

And they have brought those expectations to live theater. They expect to hear dialog as crisply, and with as little effort on their own part, in that still-acoustic space. So the poor theatrical -- and even operatic -- sound designer is forced into ever more technologically sophisticated (and expensive) systems to reinforce and amplify and (usually less successfully) clarify.

So now we come to the last salvo. And that is an audience who spends a significant part of their waking life with earbuds in. They no longer have any first-hand experience with a physical acoustic environment. To them, the sonic cues that tell how far away a sound is, or how big a room is, are those created by designers -- by film and television sound designers, but even more frequently by game programmers.

Just as we can no longer trust our audience to understand an actual recorded gunshot -- so we need to present them with the fake, wrong, ersatz gunshot they expect -- we can no longer trust them to pick up environmental or physical acoustic clues that mimic or are taken from the real world. To them, increasingly, distance is reverb and a shout is merely volume.

We may, as designers, have to learn this new and artificial language instead if we wish to communicate with our younger audience.

But then, the way some trends are going, we might just put aside the microphones entirely and put the whole thing in the form of tweets.

Sunday, August 14, 2016

Yamatai II (fanfic thoughts)

I've written before about my issues with the reboot "Tomb Raider" (2013). I'm left with no firm idea of how one could have made it a better game. However -- and topical in that a movie is apparently about to enter production -- one can put aside questions of playability and game balance and ask only what would make a better narrative.

And this is another rambling ranting essay, so the rest is below the fold.


Essay: Worldization

The human brain is very good at picking up subtle audio cues; the little changes in phase and frequency content and direction that between them reveal the size and distance of an emitter and the size and surfaces of any enclosure around it.

These elements are difficult to fake, and nearly impossible to remove. If you record a voice-over session in a room, it will sound like it was recorded in a room. There is almost nothing you can do to remove those tell-tale clues.

Physicality matters. And physicality also sells. So as a sound designer, you can leverage that same physics.

One of the old tricks used in cinema was Worldizing. Basically, this meant taking pre-recorded material and playing it back in the same or similar acoustic space that was being shown on screen. Then record that. A similar trick has been used in record production; the most obvious being the "transistor radio" effect, achieved by -- yes -- playing back the track through a small speaker and picking that up on a mic.
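When you can't physically re-record through a small speaker, a crude digital stand-in is to band-limit the track to roughly the 300 Hz to 3 kHz range a tiny driver actually reproduces. Here is a minimal sketch in plain Python; the one-pole filter design and the cutoff values are my own assumptions, not a recipe from any particular production:

```python
import math

def smallspeaker(samples, rate=44100, lo=300.0, hi=3000.0):
    """Rough 'transistor radio' band-limit: a one-pole high-pass at
    `lo` Hz cascaded with a one-pole low-pass at `hi` Hz. A crude
    stand-in for actually re-recording through a small speaker."""
    dt = 1.0 / rate
    a_lp = dt / (dt + 1.0 / (2 * math.pi * hi))   # low-pass smoothing factor
    rc_hp = 1.0 / (2 * math.pi * lo)
    a_hp = rc_hp / (rc_hp + dt)                   # high-pass coefficient
    out = []
    lp = 0.0          # low-pass state
    hp = 0.0          # high-pass state
    hp_prev_in = 0.0  # previous input to the high-pass stage
    for x in samples:
        lp += a_lp * (x - lp)                # roll off the highs
        hp = a_hp * (hp + lp - hp_prev_in)   # roll off the lows
        hp_prev_in = lp
        out.append(hp)
    return out
```

It is still second-best: a real small speaker adds distortion and cabinet resonances of its own that a simple band-pass won't, which is exactly why re-recording wins.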

You can perform this same trick live in the theater environment. If you have a sound effect that is pretending to be from an on-stage radio, then play it back through a small speaker. And place it as close as you can to where the prop is; again, that human audio processing system is uncannily good at figuring out where a sound is coming from in 3D space.

Of course, there are ways to fool that mechanism. One very useful trick for theater is the Precedence Effect. Simply put, the brain localizes on the first source heard (and/or, within a graph of intensity versus precedence, the loudest). So you can reinforce the sound of that small speaker to make it louder and fill in more of the low end content with other speakers, and as long as you stay within certain constraints of volume and time the sound will still "appear" where you physically placed the small speaker.
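The arithmetic behind those constraints is just travel time: sound covers about 343 m/s, so the reinforcement speakers need enough added electronic delay that the practical speaker's wavefront still arrives first at the listener. A sketch under stated assumptions (the 10 ms margin and the function name are mine; real systems get tuned by ear and measurement):

```python
SPEED_OF_SOUND = 343.0  # m/s at room temperature

def fill_delay_ms(d_practical_m, d_fill_m, margin_ms=10.0):
    """Delay in milliseconds to apply to a reinforcement ('fill')
    speaker so the on-stage practical speaker is heard first and
    keeps the localization. Adds a margin that sits inside the
    roughly 5-30 ms window the precedence effect allows."""
    # travel-time difference between the two paths to the listener
    path_diff_ms = (d_practical_m - d_fill_m) / SPEED_OF_SOUND * 1000.0
    # fill must arrive later than the practical, by at least the margin
    return max(0.0, path_diff_ms) + margin_ms
```

For a listener 12 m from the practical speaker and 5 m from a fill, this suggests roughly 30 ms of delay on the fill; past that window the fill starts to read as an echo.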

It isn't just speaker size. Placement matters. And so does the environment. If you place a speaker behind the set it will reverberate around the off-stage spaces and carry with it an aural "map" of that space. Put it on stage, within a defined space there (say, the couple of set walls that represent what the audience can see of a connected anteroom or bathroom, or inside a cabinet or coffin) and those acoustic spatial cues will be added to the sound.



The simplest recording process is to record dry and add the appropriate ambiance later. However, there are times it makes sense to record within a specific acoustic space to begin with. Record in a stairwell and it will sound like a stairwell (or, at least, like a tall enclosed space). Interestingly, you can record on the actual stage and, if you've placed your microphone well (either in the audience, or near the speakers), on playback you will get phantom sources that seem to exist right there in the space with you. I did this once for Rosencrantz and it was most effective.

On the flip side, you don't want every VO session to sound like the lobby, or every instrument you record at home to sound like your living room. Dampen those give-away reflections. I often record VO in costume shop storage, because all those hanging fabrics provide an acoustically dead space.



Physics appears in sound in other places. The vibration modes of any object -- not just a musical instrument -- change with intensity. You cannot record speaking and make it sound like shouting, or record a light tap and make it sound like a hard crunch. Or even record a piano played softly and make it sound like a piano being played vigorously. Physics doesn't allow it.

Again for voice-over work, if you want a voice to sound like it is twenty feet away, record from twenty feet away. Conversely, if you want it to sound like it is on a phone or a headset, then get that mic that close (or, better yet, find a phone or headset and record through that).

The latter is better because, once again, physics. You can simulate what a carbon element sitting in a phenolic handset sounds like, but you get a more accurate simulation with less work if you just record through that actual technology to begin with.
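The distance half of that advice has a simple first-order model behind it: the direct sound of a point source in a free field falls off 6 dB per doubling of distance. A quick sketch (free-field direct sound only; a real room adds reflections, which is exactly the "room tone" discussed above):

```python
import math

def level_change_db(d_ref_m, d_m):
    """Free-field level change in dB for a point source heard from
    d_m instead of d_ref_m: 20*log10(d_ref/d). Negative means the
    direct sound got quieter. Direct-sound component only; room
    reflections are not modeled."""
    return 20.0 * math.log10(d_ref_m / d_m)
```

Moving the mic from 1 m out to 20 feet (about 6 m) drops the direct sound by roughly 16 dB, so the room's reflections start to dominate the pickup, and that is a large part of why the voice reads as "distant."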

And this blends into sound effects. There is much more to be said on sound effects; about how audiences have been trained to expect things to sound a certain way (which they do not in real life), about how you need to focus in and strip down real sounds in order to "sell" them in the limited sonic window you have available in a play, and how distorting the real and creating the unreal are part of the art of sound design. But the best starting point is with real sounds. Not necessarily the sounds of that exact thing, mind you -- see above! -- but with real sounds. A microphone pointed at an actual mechanical object, be it a snapping twig or a wind-up clock, delivers multitudes of detail that are difficult to synthesize.

And real sounds have, well, reality. A gravitas, even. They carry that verisimilitude of real objects operating under the real physics we've instinctively absorbed through living in that physical environment. Even when you use a sound out of context, or use it to sell something quite different from its actual origin, those tiny cues of vibration nodes and damping and the little bits of noise of clattering and chattering and slithering and scraping are all there making it feel more real -- as well as more complex and more engaging.



Lastly, microphones have response curves and pick-up patterns. Equalization after the fact introduces phase shifts (as well as other artifacts); the better way to get a "bright" sound is to start with a "bright" mic. Real objects -- this is particularly obvious in musical instruments -- radiate from multiple sources in multiple directions. A microphone a few inches from a violin will hear a distinctly different sound picture placed over the bridge, the neck, the back, or an f-hole. And that same microphone a foot away will get yet another set of pictures depending on what part of the violin you aim for.

This is why placing microphones properly is so essential to getting the desired sound from a musical instrument that is being recorded (or reinforced in a live sound situation). The right mic, the right position, the right distance; these are all things that are difficult but not impossible to correct at the console or with plug-ins in the DAW. Which is not the same as saying post-processing never happens. There are instruments that are almost defined by artificial processing of the original acoustics, primary among them being the rock drum kit.

This matters for voice-over recording too. Or for foley work. It is essential in both to put on headphones and find out what is actually going to the recorder. Search out the sound field, move the mic and change the angle, to find the sound you are searching for.