Cobbled up a script light using one of the DuckLight prototype boards. Made a flexible stand with an eight-inch length of armature wire from the art supply store, and the neodymium magnet from a 1" tweeter as a base. I'm having odd issues with the software, though -- there's a noticeable flicker, and I'm pretty sure the chip is running at 8 MHz. This is something I need to solve to make the circuit truly useful. But at least I can see my cues on my current show.
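A quick way to sanity-check the clock suspicion is a dumb blink sketch timed against a watch. This is only a sketch of the idea, assuming an Arduino-style AVR core; pin 13 is a placeholder, not the DuckLight's actual LED pin. If the fuses or the board definition don't match the real oscillator, the nominal one-second blink runs visibly fast or slow.

// Clock sanity check (assumptions: Arduino-style core, LED on pin 13).
// If F_CPU doesn't match the real oscillator (e.g. the CKDIV8 fuse is
// still set), this nominal one-second blink runs visibly off.
const uint8_t LED_PIN = 13;

void setup() {
  pinMode(LED_PIN, OUTPUT);
}

void loop() {
  digitalWrite(LED_PIN, HIGH);
  delay(1000);  // should be one real second against a stopwatch
  digitalWrite(LED_PIN, LOW);
  delay(1000);
}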
I'm also re-thinking the button scheme once again: if I can get the software working on the Holocron, I think it might just provide enough control for many of the "quick, find a light to put in this prop" applications. It is a lot of programming and will be fiddly to use, but with three buttons I should be able to select a color and choose among several modes: steady, pulse, flash, swirl, and flicker.
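The firmware side of that three-button scheme doesn't have to be elaborate. Here's a rough sketch of the selection logic only, with placeholder pin numbers and an arbitrary eight-entry palette; none of this is the actual Holocron code.

// Rough sketch of the button scheme (pin numbers and palette size are
// placeholders). One button steps through colors, one steps through modes;
// buttons are wired to ground and use the internal pull-ups.
const uint8_t COLOR_BTN = 2;
const uint8_t MODE_BTN  = 3;

enum Mode { STEADY, PULSE, FLASH, SWIRL, FLICKER, MODE_COUNT };
uint8_t mode  = STEADY;
uint8_t color = 0;   // index into a palette table, not shown here

bool pressed(uint8_t pin) {
  // report one press per button-down, with a crude settle delay
  static uint8_t lastState[20];
  uint8_t now = digitalRead(pin);
  bool edge = (lastState[pin] == HIGH && now == LOW);
  lastState[pin] = now;
  if (edge) delay(20);   // let the contacts stop bouncing
  return edge;
}

void setup() {
  pinMode(COLOR_BTN, INPUT_PULLUP);
  pinMode(MODE_BTN, INPUT_PULLUP);
}

void loop() {
  if (pressed(MODE_BTN))  mode  = (mode + 1) % MODE_COUNT;
  if (pressed(COLOR_BTN)) color = (color + 1) % 8;   // arbitrary 8-entry palette
  // ...drive the LEDs according to the selected mode and color here...
}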
Adding the option for interactivity is harder. But...I just realized I can add another mode by polling the buttons and taking any voltage outside the range the programming buttons supply as a valid input. So it might be simple to add a one-shot trigger mode. Less simple to add an intensity control (unless I've got another analog input lying around I haven't remembered).
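The reading-outside-the-expected-band check for that trigger is only a few lines. Again, a sketch with made-up numbers: it assumes the programming buttons share one analog pin through a resistor ladder, and the pin and thresholds would have to be measured off the real board.

// One-shot trigger sketch (pin and thresholds are placeholders).
// Anything pulling the sense line below TRIGGER_MAX is treated as an
// external trigger, since no programming button ever reads that low.
const uint8_t SENSE_PIN   = A0;
const int     TRIGGER_MAX = 100;

bool armed = true;

void setup() {
  pinMode(13, OUTPUT);   // stand-in for "start the effect"
}

void loop() {
  int v = analogRead(SENSE_PIN);
  if (armed && v < TRIGGER_MAX) {
    armed = false;              // fire once, then wait for release
    digitalWrite(13, HIGH);     // kick off the flash/swirl/whatever here
  }
  if (!armed && v > TRIGGER_MAX + 50) {
    armed = true;               // a little hysteresis so noise can't re-trigger
  }
}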
Related to this, a family of megahertz-range radio modules has recently gained a decent software and hardware infrastructure, so there is now an option that is cheaper and has better transmission range than the XBee modules I've been using up until now.
Managed to take the class on the Universal Laser and can now laser-cut and engrave at TechShop again. And sold off the last of my stock of Aliens grenades (I still have a few mostly-finished bodies around). The show I'm working closes next week and I'm really looking forward to having no theater for a while and being able to get more prop work done.
Monday, May 30, 2016
Rice Surprise
One of the tricks of engineering is to know what you need to be looking for. The entire Holocron project has been an exercise in belatedly realizing one thing after another that needed to be researched and calculated and worked out.
I was so sure I was on track to start slapping down components. Then while I was re-reading the pdf on the lithium polymer charge monitor chip I'll be using (a cute little SOT-23 package), I realized the programmed pattern of conditioning charge and monitor before delivering full charge, and the taper off to maintenance charge, were all going to play badly with the requirement of the rest of the circuit to suck power for the LEDs.
It took only a moment of googling to discover this was properly called a "Load Sharing" problem, and articles had been written and circuits proposed to deal with it. Meaning I need to add more components to the board. And, as I discovered when I checked my Eagle libraries, meaning I need to source the parts, get their footprints into Eagle...
(I also worked for a while trying to make the footprint work with the through-hole neopixels I'd found. Looked for a while at SMD versions, but couldn't figure out how to properly surface-mount on the back side of the PCB. Looked for a while for pre-wired strings and finally decided on a minimal breakout board that could be easily assembled into a string with ribbon cable. Once I've made my final decision on a supplier for that, too!)
And then there's times that empiricism is close enough. I've done so much trick-line rigging, I knew before I walked into the theater I could use span-sets (rather, some slings from my rock-climbing collection) as supports and quick-links as pulleys. I also knew I'd need a little weight to drag the line back down to the ground after we'd put it away the previous night. So with about twenty minutes to go I finally got around to stitching up a tiny sand-bag. Which I filled with....rice from the kitchen.
I don't call it impossible that there might be some strange interaction between rice, nylon, canvas, the steel of the old building, or whatever. Or some unusual factor like hungry bats or a lightning strike. If I was doing this rigging on a ten-million-dollar satellite, you can bet I'd check every wacky suspicion that occurred to me. But for this, for a two-week run of a lightly-attended show in a black box theater, none of the plausible failure modes are of high enough probability to worry about.
It's the Holocron -- which will be available both as a kit and as a fully assembled board to purchasers -- that I have to get right. And more and more I think that with the first one I just plain got lucky.
Saturday, May 28, 2016
Canadian Ninja
(Title in reference to the Michael Dudikoff action film from the mid-80's).
Friend of mine is refereeing a "Morrow Project" campaign. Among the many armed groups prowling the post-nuclear-war wastelands are, he decided, at least some who have kept alive certain martial arts traditions. (Even if they have no memory left of the ninja craze that swept the west around the 80's.) When he brought this up, one of the players (presumably facetiously) suggested that the groups "up North" might use throwing stars in the shape of the maple leaf on the Canadian flag.
How canon this eventually became within the campaign, I do not know. But that suggestion called for a prop.
You might call it a "Throwing Maple." Or a "Ninja Leaf." Maybe "Maplekan?" Or my favorite (very subtle), "Shurican." This is a practical prop, in the sense that it is metal and will stick in a tree (or at least a fence).
However. This is not knife-grade steel; it was cut from an outlet box cover, 16-gauge galvanized steel of some ferromagnetic but basically cheap low-strength alloy. It also only looks sharpened. I spent longer researching California law than I did cutting it out, with the result that it is completely blunt and suitable only for display use, and thus fails to meet two of the essential criteria.
This was a quick prop. Found a maple leaf image, printed it out at 4" across, and used Spray 77 to stick the print directly to the metal (an outlet box cover from OSH, cost about a buck). Cut it out with a jigsaw and a medium metal-cutting blade, then smoothed it with a hand file. The longest part of the build was making the bevel, which was achieved almost entirely by hand filing with an old set of needle files.
My original intent was to gun-blue or even parkerize it. The blue wouldn't take, and parkerizing is a little too scary to get into right now. So spray paint. I figured I could go over the edges with a file or emery paper to bring them back to naked steel. Well, yes, if I wanted to spend a heck of a lot longer. After getting the paint all scratched up, I knocked it back with quick passes of the emery paper (aka 220-grit wet-dry sandpaper) and did the black with a DecoColor paint marker instead. Then, since it looked banged-up already, I dipped the whole thing in Birchwood Casey Plum Brown (aka insta-rust) and buffed/weathered it with steel wool to give it that "been carrying this throwing star around in a pocket of my black gi for a few days" look.
Sunday, May 22, 2016
The Internet of (Theater) Things
I was at Maker Faire this weekend, and like so many other parts of the inter-tubes it was all IOT, IOT, IOT. And I'm wondering if it is time to rethink the usual objection to using Wifi in a performance context.
The old assumptions were that wifi wasn't reliable enough for performance. But then, we used to assume computers weren't reliable enough. I have seen the pleasant glow of the Blue Screen of Death from a sound booth or two, but I ran into many, many more people over the years who insisted on using tape, cassettes, or (eventually) CD's for effects playback because they didn't trust that a computer wouldn't break on them.
Well, I think most people have moved past that. Computers have become the default for sound playback as well as video playback. Crashes still occur but they tend to get ironed out in Tech; the show-stopper failures I have personally observed have been due to the batteries running dry on a production laptop.
I've seen a slow movement towards acceptance of iPad links to sound boards. I remember when the Yamaha board was iffy at best, but now that Wifi link is considered reliable enough for use in the time-critical environment of Tech and Sound Check on a professional-level show.
(On the other side of the technology-adoption bell curve, I've personally run numerous productions with laptop and software tools due to not having the budget for rack-mount equipment. Sub-mixed drums for one musical on the laptop, and that worked well enough that on a later production I fed all of the sound reinforcement through it, using freeware plug-ins within Reaper to achieve a graphic equalizer for the house speakers. My last show, I was running sound effects, projections, and even running lights from the one laptop. And I've seen a lot of this sort of exercise in similar improvised micro-budget shows.)
After all, some of the oldest documented theater technology was borrowed and adapted from Elizabethan-era sailors: rope rigging, counterweights, whistle codes. It's a natural path from there to modern techs using cell phones to communicate to backstage (instead of trying to come up with the money for the old-school hardwired headset systems). And a lot of people are using DAWs for sound manipulation or MIDI hosts for live keyboards, or (again on the other side of the technological bell curve) personal music players or similar software like iTunes for backing track and effects playback.
In fact, at some levels of theater it is considered ordinary and natural to plug an MP3 player into the sound board and hit one of those little fiddly buttons at the exact instant called for in the script. (Putting the sounds on the hard disk of a computer with sound playback software that was specifically written for performance use is the more reliable alternative now!)
Which brings us to a segue. Effects -- or more broadly, all the possibilities of both the established ideas of theatrical lighting, properties, and scenery, and the less widely accepted ideas of interactive technology -- are not exactly called for in the script. One even suspects that in the golden age of musicals and old chestnut standards, the Annies and the You Can't Take It With Yous, the writer brought the same awareness of what could be practically done with multiple settings in scenery and quick-changes in costumes to what could be achieved in the way of on-stage telephone calls and so forth.
Which is to say, most shows don't require really clever technology to get a sound or lighting effect to happen in the right spot. Between the way the script makes the timing of the effect non-critical, and the way the presentational aspect of the box set with the missing fourth wall et al makes playing a telephone sound out of a speaker appropriate and sufficient (or at least sufficiently appropriate), there isn't a need for something more elaborate.
At least, not there. When you get to more modern works, and better yet, to those experimental works that straddle worlds of dance, improvisation, performance art, et al, there are plenty of spaces to explore more complex interactions than "play back a sound effect at a specific moment in the dialogue."
Of course, these also tend to get worked out in development. Sometimes I have had a performer or a puppeteer or a musician or whatever come up to me and ask, "I'd like to have this happen when I do this; is it possible?" But mostly, a choreographer or a props person or someone sees something interesting that's already out there in the world, and the performance is designed around how that existing thing functions; designed to accommodate the existing advantages and the existing flaws.
(Specific case in point: in a production of Wizard of Oz we used light-up globes extensively. Each time these were brought out, there was specific choreography to allow each performer to turn their back to the audience and page through all the available colors in these off-the-shelf devices until they got to the desired effect for that moment in the show.)
So it seems to me that the process of using the kind of effect modern electronics makes possible starts from the Designer. Instead of problem-solving something that the rest of the design team would like to happen, you become the one to suggest an effect. Which means you are effectively working out of what already exists or is known to be possible, rather than working from something that wants to happen on stage and developing a solution to it.
For that reason among others, I'm not that interested in the experimental end -- in the kind of process that puts accelerometer-controlled LED strings on a dancer, or whatever. Because as I pointed out above, this is more a process of adoption than design. You more or less start with available consumer products, and you develop a way to use them over rehearsal.
My interest, from pretty much when I started using electronics in theater (back when the height of my technological output was sticking leaf switches around a motor-driven cam to create a hardware string-light chaser), has been in what I'd call "sweetening."
Here's a conceptual framework from another industry. A movie is more-or-less filmed MOS. In the older days this was a technical necessity, these days it is an artistic choice. Dialog may be taken from the shoot, but the totality of the sound environment in the finished product is a created thing. This is for focus and nuance; the noise is stripped away, and the only sounds still there are those that tell the story -- and they are pushed, too, for artistic nuance and emotional effect.
And this process has already begun in theater. We can't pan, zoom, or cut; we don't have that control over what the play-goer watches. But we do control lighting and we build and even paint scenery to place the eyes and the attention where we want it and to make essential story-telling and emotional points.
And artificial sound has entered. Even in smaller spaces, even in opera, subtle reinforcement and other acoustic shaping is already taking place. In larger houses, and in musicals, dialog is already passed through processing to make it larger than life, in the same way Hollywood takes the best the boom mics and hidden lapel mics can carry back from the set, marries it with ADR done in the studio, and presents the final honed and processed mix to the audience.
At the very simplest level, I think we can now have the sound issue naturally from an on-stage walkie-talkie or phonograph or bugle (or appear to), and we can have the light from a television or cell phone (or the lights of other items of technology) doing what seems natural for them to do. The history of technical theater is full of examples of sticking colored lights in empty TV cabinets and sticking speakers under chairs and otherwise producing these illusions. Well, we can do them better now -- even if that means just pushing actual video out through a length of VGA.
But at the more complex level, I think we can have a gunshot sound "right." I think we can have a sword fight with the exciting (and thoroughly fake) sword sounds of a movie. And more subtly, I think we can treat voices and footsteps so the actors sound like they are walking a marble hall instead of wooden platforming.
But many of these require wireless control. Props move. Actors move even more. Cheap wireless is one of the necessary parts of making it possible to bring about this sort of realized environment, this sort of naturalistic (or hyper-naturalistic) stage environment. And it may be that Wifi and the IOT have reached a point of maturity where they can be trusted in a production context. And not just on experimental theater pieces with fourteen people in the audience, but in staid professional theaters where an equipment breakdown is seen not mostly by your own circle of friends, but by hundreds of people paying thirty-five bucks a seat.
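To make that concrete at the "blinky prop" level: the receiving end of a wireless cue doesn't need to be much more than this. A sketch only, assuming an ESP8266-class module running the Arduino core; the network name, port, and one-byte cue "protocol" are all placeholders, not any real show-control standard.

// Minimal wireless cue receiver sketch (ESP8266 Arduino core assumed;
// SSID, password, port, and the one-byte cue format are placeholders).
#include <ESP8266WiFi.h>
#include <WiFiUdp.h>

WiFiUDP udp;
const uint16_t CUE_PORT = 8000;

void setup() {
  pinMode(LED_BUILTIN, OUTPUT);
  WiFi.begin("show-network", "password");          // placeholder credentials
  while (WiFi.status() != WL_CONNECTED) delay(100);
  udp.begin(CUE_PORT);
}

void loop() {
  if (udp.parsePacket() > 0) {
    int cue = udp.read();                           // first byte is the cue number
    if (cue == 1) digitalWrite(LED_BUILTIN, LOW);   // practical "on" (LED is active-low)
    if (cue == 2) digitalWrite(LED_BUILTIN, HIGH);  // practical "off"
  }
}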
Saturday, May 7, 2016
Shut up and take my money
New idea: post when I've actually got something done. To paraphrase Blaise Pascal, I'd be writing shorter blog posts, but I just haven't had time.
Opened another minimal show. Borrowed lights running on eBay dimmers, video projector on the end of 100' of cheap Amazon VGA, borrowed Yamaha powered speakers on OnStage stands. USB to DMX converter box, HDMI to VGA converter cable, and everything is running off my little laptop. Which is a bit painful. I'm using QLab for audio and some of the video playback, QLC+ for lighting control, and VLC to play back the movies at the end of the show.
The last wrinkle is a Korg nanoKEY wired as a QLab "Go" button so I can have QLC+ in the foreground to adjust lighting looks while controlling sound and video in the background; as I've mentioned before, MIDI control bypasses the desktop "focus" problem and is always live.
(No -- resizing the windows isn't enough for this setup. Not to get the cues happening smoothly and simultaneously.)
The one thing I really should have done earlier was the Cue Sheets I finally printed up. Those are a lot easier to read than the crabbed notes I made during the hurried, harried tech (confused clients who kept changing their minds, equipment I hadn't quite gotten working yet, and only twenty minutes for each of a large number of different groups with different needs).
The other thing I wanted for this show was a Video license for QLab. Back when I started using QLab 1.0 I purchased the Pro Audio and MIDI licenses but not, alas, the Video. Figure 53 won't sell those any more. They are even deprecating QLab 2.0 -- they want everyone to use QLab 3.0 (which has a much larger footprint, only runs on the latest OS, and costs approximately 30x as much).
Well, QLab 2.0 wouldn't recognize my video hardware, and QLab 3.0 didn't even recognize my audio hardware. It did offer that perhaps, if I bought yet another license on top of the two I had already, I might be able to open the menu that would allow me to try to select my USB audio interface, but there was no assurance it would handle the video any better than 2.0 had.
So I contacted the company for a refund and dropped back to QLab 1.0. Seriously, I don't care what new functionality the new software offers; if it can't smoothly handle the basics, then I'd much rather they just the heck let me pay for the version that can.
In a somewhat similar mode, I've been using a free government phone. Try as I might I couldn't figure out how to tell them I now had a full-time job, wasn't eligible, and wanted to switch to paying for it. So I sat on the renewal applications, let the current service term run out, and the next time I tried to use the phone it sent me to a menu that gave me the option to "top off" from credit card. Good enough!
Collapsed in exhaustion after the first tech and took one and a half days off work. And got yelled at, but not for that. But now it is the weekend, the show is up, and aside from needing to put aside an hour or two to try to transcribe more of my notes from Tech into human-readable output, I have a free day, I've had some sleep, and I have quixotic hopes of actually getting some work done on my CAD.
Oh, yes, and I did spare the CPU cycles to put a new Tomb Raider/SG1 chapter up. Teal'c explores Croft Manor, Alister turns up at the indoor shooting range with a load of peculiar historical guns from Lord Croft's collection, Zip flirts with Carter over Skype...and Daniel and Alister have a marathon bull session touching on everything from the Narmer Palette to the Kon-Tiki expedition.
Saturday, April 23, 2016
Pathetic Fallacy
I fired up Tomb Raider 2013 recently just to play the hunt-for-your-food sequence again (and try out some DLC -- like a warm jacket, finally!) And I couldn't help noticing this time around that no matter how long you spend wandering around the woods, the rain starts the very second you shoot a deer.
But I've also been playing other games, and reading reviews, and a lot of what impressed me earlier no longer impresses me. There is the core of a nice little story there and the voice acting and motion capture support it well. But ninety percent of the game is a stock first-person shooter with stock mechanics, graphics tricks, game assets, character AI, etc. As nice as some of the shrines and other scenery are, the majority of the art assets are the same tired variations of room full of boxes and cluttered alley between cookie-cutter buildings.
The new game, Rise of the Tomb Raider, ups the graphics, adds a little more variety to the combat options and improves the crafting system, but basically is the same routine. Of which an absurd amount is still the barely-interactive scripted sequences.
Really, what happened with games? So many of them are striving for spectacle. Sure, with modern graphics cards you can do spectacle, but Hollywood can do it even better. The peculiar strength of gaming is that it is interactive. And spectacular action sequences that force the player to be an almost completely passive viewer are not playing to this strength.
I've said this before. There's one sequence in Tomb Raider 2013 where a scared (but determined) Lara has to climb to the top of a rusted, shaky, and very tall radio tower in order to send out a distress call. On the first play through, this is nail-biting, seat-of-the-pants scary. But on a second play, a terrible truth becomes obvious; the entire sequence is so tightly scripted you can not fall even if you try. In fact, the only action you as a player ever take over the entire five-plus minutes of this sequence is to hold down the "go forward" key.
And so many games do this. They put in pre-rendered cutscenes. They put in quicktime events (which stab themselves in the back, as they force the player to not get involved in the spectacle but instead focus narrowly on whatever symbol has popped up that requires that corresponding key to be hit). They don't even make it possible (in far too many cases) to skip through this junk on a second play-through. So they sacrifice playability in that way, too; they force the gamer to do things that aren't interesting (like waiting through a Quicktime Event) instead of letting them, well, play.
Interact. Be involved in the material. Be immersed, in those ways that games permit and movies do not.
I also just played one of the old Call of Duty games -- a World War II setting, in keeping with my current interest in history. And the biggest problem I have with this game is a similar one to that which I have with Tomb Raider 2013. I want it to be more about the purported subject, and less about generic mechanics.
Not that I think this would be easy to achieve. Or even necessarily sell well. Call of Duty is very much a "twitch" game. Now, it does focus on events -- such as the Normandy Landings -- which were incredibly fast-moving and chaotic. But I've taken part in military exercises and outside of the last moments of a banzai charge the pace is a little slower.
The first sequence, for instance, places you as a Soviet peasant conscripted into the defense of Stalingrad. It actually establishes pretty well, with such cute bits as having you practice throwing grenades using a bucket of potatoes, that the Soviet army is poorly equipped. But then combat begins, and for all intents and purposes the only reason to ever conserve ammunition is that the reloading animation takes so long. Really, like practically every other first-person shooter, you are encouraged to hose the landscape.
I made a point of going through even large parts of the Normandy sequence with a rifle, and choosing to look through the sights rather than firing from the hip. And this slowed down the breakneck pace just a little, but it is still far from a realistic experience.
And, yes, there are nice models of appropriate settings, uniforms, and equipment. There are little set-ups in film-reel style, and short diary entries. Just enough to where I did sort of get the sense of being a British soldier at El Alamein (or wherever). But really the mechanics trump any need to pay attention to specific details; grab any weapon you see on the battlefield and pull the trigger whenever the cross-hairs turn red, run in the general direction of the big arrow, and keep shooting until the next cutscene begins.
Now, I'm not asking to have to study a topo map and strain to understand static-swamped radio messages in order to figure out the next objective, any more than I'm asking to have to spend three hours scraping a one-meter grid with a trowel to find the next artifact in Tomb Raider. But I think there's room for a lot more context.
And I think the standard model of the first-person shooter was sufficiently exercised by the time Doom II came out. Playing as an Army Ranger at the cliffs of Pointe du Hoc should not be essentially identical to the experience of playing as a young archaeologist shipwrecked on an island filled with savage cultists and an ancient mystery. Let's not be afraid to tinker a bit. Especially, let's find a way to support game length other than spawning a truly ridiculous number of essentially-identical targets.
Artefact
I tried out the NextEngine 3D scanner. Confirmed that working up a sculpted prop at 2x scale, digitizing it, then 3D printing it at final scale is a plausible workflow. The NextEngine software is pretty good with stitching and with decimation/retopology, and scanning is relatively fast. There are limitations on size, however; a 4" x 5" window, for instance, for anything using the attached rotary table.
Yet I am still torn on the design of the Wraith Stone. It seems defensible within the information presented in the game that this is a manufactured thing, an item of Atlantean technology like Excalibur or the Galali Key. You could even defend the green blobs as Atlantean blinkenlights (or as some necessary aspect of their technology, like the fiddly bits Howard Tayler puts on everything technological within the Schlock Mercenary universe.)
Really, though, the kind of thing I'd like to have (and like to try building) is something that looks more like a proper archaeological artifact. Something old, worn, rubbed smooth around the edges. Something that comes from an aesthetic before the Classical Greeks and their primacy of geometry; something informed by other aesthetics that to our modern eyes looks lumpy, off-center, misshapen.
I mean this with absolutely no insult towards two wonderful renditions in this direction I managed to find.
The first here is a sculpt by DeviantArt user and cosplay enthusiast Very-Crofty. I love the beard on this one; it feels very Green Man or something (whereas the high forehead with markings feels somehow Maori to me -- a nice mix of cultural elements that feels like the real product of a complex cultural heritage).
Reverse image search on this one turned up an eBay seller but I was unable to find any real information about the artist. It is also a sculpt, and I am impressed with the way they've incorporated the "glow-y bits" into it. It also has both that "primitive" aesthetic and the worn-by-countless-hands that gives it a proper feeling of unutterable age.
Even in-game, the artifact looks more like this than it does in the development art (or the derived image that gets used in the title movie for Tomb Raider: Legend).
This is a texture -- possibly from the one-and-only in-game closeup -- ripped from Tomb Raider: Legend by a user posting at the Tomb Raider Forums. Again, the shape is much cruder, although the general proportions are still recognizable. The way this image is strikingly different is that in all the other in-game shots the stone is dark but the face/skull is picked out by being much lighter.
Ah, well, back to the Holocron. I still haven't resolved the glue issue with the "circuitry" layer. The last idea was to push the connection out to the edge and hide it behind the edge of the shell, but that makes that part of the shell too wide to look right in the Stolen Holo pattern.
Also, TechShop just got rid of all of their Epilog lasers. So I need to go back and take and pay for ANOTHER damn class just to be able to keep using the cutters. At least the new lasers are faster at engraving...