
Anime Endings: R.O.D The TV


The original Read or Die OAV was a Bond-movie spoof with superpowers. Like most of Roger Moore’s Bond films, the action, humor, and engaging characters kept you from getting hung up on the basic silliness of the plot. And, of course, Yomiko’s paper-mastery power was novel and visually impressive. I liked it.

I liked R.O.D the TV more, despite its flaws. Why? Like some of my other favorites, it’s all about the characters. The “big plot” that ties it in to the events of Read or Die is not only silly, but overexplained as well. Nearly an entire episode is wasted on clunky “as you know, Bob” exposition, and the villain doesn’t just gather the heroes for one final monologue; he gives them an open mic to all his henchmen.

The truth is, the series didn’t need a “big plot”, and it definitely didn’t need one that depended so strongly on characters from the OAV, while fundamentally altering their personalities. I like Nenene and the Paper Sisters, and I think they could have carried the show on their own. Deep down, I think the writers knew this, too, which is why the “slam-bang action finale” took up so little time in the last episode, and was followed by quiet scenes of the cast getting on with their lives. Ultimately, R.O.D the TV wasn’t about finding Yomiko, saving the world, or even paper-mastery; it was about these four women.

I knew up-front that the series would include Bond-ish clichés and a villain whose plans made Dr. Evil look sensible. I knew there’d be paper-mastery. I figured Yomiko had to show up eventually, especially when I discovered Nenene’s history in the manga. In those respects, the series met my expectations. In the way it handled the personalities and relationships of Michelle, Maggie, Anita, and Nenene, it exceeded them.

The way it presented Joker was jarring if you’d seen the OAV, but a bit less so if you’d also read the manga. Wendy’s change was poorly explained, and made even less sense if you’d read the manga. As for Gentleman, “well, that came out of nowhere”.

Apple iPod Store


We tried to stop at the Palo Alto Apple Store on Friday, only to find it closed for a brief-but-thorough renovation. It reopened today, and it’s very iPod-centric now. You can buy new Macs there, but if you’re looking for software, books, or other accessories, the shelves are pretty bare.

The optimistic interpretation is that they’re temporarily compensating for sluggish Mac sales caused by the announced switch to x86 processors.

Salmon teriyaki


I’ve been looking at Japanese cookbooks recently. The first one I bought, 英語でつくる和食 (roughly, “making Japanese food in English”), is fun to read, since it puts both the original Japanese recipe and an English translation on facing pages. After trying out a few things, however, I’ve come to suspect that the English versions were never tested by people who only spoke English.

So, a few days ago I picked up The Japanese Kitchen, which is meticulously organized by ingredient, and gives sample recipes for each. One of the examples for soy sauce was salmon teriyaki, with homemade teriyaki sauce.

Most teriyaki dishes I’ve had have been pretty awful, and the sauce had a lot to do with that. Obviously, they weren’t using homemade. If you have access to a gourmet or Asian grocery store, you should be able to find what you need:

  • 3 parts soy sauce
  • 3 parts sake
  • 3 parts mirin (sweet cooking sake)
  • 1 part granulated sugar

(Comparing this to the ingredient lists on a few bottles of commercial sauce explained a lot.) For reference, 6 tablespoons each of soy sauce, sake, and mirin plus 2 tablespoons of sugar makes a bit more than a cup before reduction. Combine the ingredients and bring to a boil, stirring to dissolve the sugar. Lower the heat, and simmer until the sauce has reduced by about 25%. Let cool.

Making the salmon isn’t any harder. Heat the oven to 350°F. Cover a baking sheet with foil, put a wire rack on it, and lay the salmon fillets (4-6 ounces each) skin-side down on the rack. Baste with the sauce and put them into the oven for five minutes; baste again and repeat until your trusty digital thermometer reads about 145°F. Pour some more sauce on the fillets and serve.

I might try grilling them next time, although that’s risky on my nuclear Weber. I will try grilled teriyaki beef kabobs with this sauce. Maybe that’s Monday night’s dinner…

Subtle changes in Tiger


I finally upgraded my primary Mac to Tiger, because the 10.4.2 release seems to have stomped most of the bugs I cared about (or the bugs I cared most about; works either way). There are some things that I don’t consider improvements, like the Mail.app UI, but so far only one change has actually annoyed me: emacs broke.

More precisely, the vt102 emulation in Terminal.app changed just a tiny bit, forcing me to remove the stty -tabs line that’s been in my Unix dotfiles for the past 18 years. It’s a pity I got rid of my honest-to-gosh DEC VT102 about ten years ago; otherwise, I could file a truly outraged bug report with Apple.
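
If you’d rather guard the line than delete it outright, something like this in a shell startup file would do. Consider it a sketch, not gospel: the test assumes the TERM_PROGRAM environment variable that Terminal.app usually sets, and it simply skips the workaround there while keeping it for older terminals that still want it.

    # expand output tabs to spaces everywhere *except* Tiger's
    # Terminal.app, whose updated vt102 emulation no longer needs it
    if [ "$TERM_PROGRAM" != "Apple_Terminal" ]; then
        stty -tabs
    fi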

Admittedly, the fact that this is the first time they’ve done something that broke my dotfiles is actually a pretty good sign.

Dogpile on the creationist


I’m afraid I’ve lost patience with Joe’s sophistry over at the usually enlightening Cold Fury. I didn’t expect anyone’s responses to change his mind; nearly two decades on Usenet convinced me that the best you can hope for is that you’ll give the audience something to chew on for a bit. Still, he’s so ignorant about science, and so convinced that he understands it, that you just have to slow down as you drive by and check the accident scene for bodies.

[In truth, I didn’t actually have much patience with him when I initially jumped in, because reading the previous responses made it clear that he wasn’t actually engaged in honest debate on the subject. And it amused me that the forces of science and reason were so ably represented by an old friend and new co-worker.]

Perhaps not all of this summer’s movies will suck…


Just watched the high-resolution trailer for Terry Gilliam’s The Brothers Grimm (normal, boring QuickTime trailer here). The 720p version isn’t entirely smooth on my 1.25 GHz G4 PowerBook, but it’s still quite gorgeous; upgrading to QuickTime 7 Pro and switching to full-screen mode improved it a bit (and, of course, enabled the various save and edit options as well).

The woman in the trailer who looks a lot like Claudia Black apparently isn’t, by the way.

Minolta Maxxum 7D glitch


[Update 7/23/05: okay, the rule of thumb seems to be, “if you can’t handhold a 50mm f/1.4 at ISO 100-400 and get the shot, spot-meter off a gray card and check the histogram before trusting the exposure meter”. This suggests some peculiarities in the low-light metering algorithm, which is supported by the fact that flash exposures are always dead-on, even in extremely dim light.]

[Update 7/22/05: after fiddling around with assorted settings, resetting the camera, and testing various lenses with a gray card, the camera’s behavior has changed. Now all the lenses are consistently underexposing by 2/3 of a stop. This is progress of a sort, since I can freely swap lenses and get excellent exposures… as long as I set +2/3 exposure compensation. I think my next step is going to be reapplying the firmware update. Sigh.]

The only flaw I’ve noticed in my 7D is what looked at first like a random failure in the white-balance system. Sometimes, as I shot pictures around the house, the colors just came out wrong, and no adjustment seemed to fix it in-camera.

Tonight, I started seeing it consistently. I took a series of test shots (starting with the sake bottle, moving on to the stack of Pocky boxes…) at various white balance settings, loaded them into Photoshop, and tried to figure out what was going on. Somewhere in there, I hit the Auto Levels function, and suddenly realized that the damn thing was simply underexposing by 2/3 to 1 full stop.

Minolta has always been ahead of the curve in ambient-light exposure metering, which is probably why I didn’t think of that first. It just seemed more reasonable to blame a digital-specific feature than one that they’ve been refining for so many years.

With that figured out, I started writing up a bug report, going back over every step to provide a precise repeat-by. Firmware revision, lens, camera settings, test conditions, etc. I dug out my Maxxum 9 and Maxxum 7 and mounted the same lens, added a gray card to the scene, and even pulled out my Flash Meter V to record the guaranteed-correct exposure. All Minolta gear, all known to produce correct exposures.

Turns out it’s the lens. More precisely, my two variable-aperture zoom lenses exhibited the problem (24-105/3.5-4.5 D, 100-400/4.5-6.7 APO). The fixed focal-length lenses (50/1.4, 85/1.4, 200/2.8) and fixed-aperture “pro” zoom lenses (28-70/2.8, 80-200/2.8) worked just fine with the 7D, on the exact same scene. Manually selecting the correct exposure with the variable-aperture zooms worked as well.

These are the sort of details that make a customer service request useful to tech support. I know I’m always happier when I get them.

Photoshop tips


Apropos of nothing, I thought I’d mention that the two most recently posted pictures here were resized in Photoshop CS, using the new(-ish) Bicubic Sharper resampling method, available in the Image Size dialog box. I hadn’t seen any mention of it until about two weeks ago, and had been using Mac OS X’s command-line tool sips for quick resizing.
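
For anyone who hasn’t met it, the quick-resize incantation looks something like this (the filenames are placeholders; -Z caps the longest side at the given pixel count while preserving the aspect ratio):

    # scale photo.jpg so its longest side is 600 pixels,
    # writing the result to a new file instead of overwriting the original
    sips -Z 600 photo.jpg --out photo-small.jpg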

Bicubic Sharper is much better than the standard Photoshop resizing, sips, or iPhoto. It’s particularly good for rendered images with fine detail. I’ve been working on a RoboRally tile set for Dundjinni, creating my basic floor texture with Alien Skin Eye Candy 5: Textures. Dundjinni expects 200×200 tiles, but Eye Candy renders best at larger sizes. Resizing down from 800×800 using the straight Bicubic method produced an unusable image. Bicubic Sharper? Dramatically better.

I found the tip in a discussion of photo-processing workflow, which makes sense. For a long time, photographers have made Unsharp Mask the final step in their workflows: if they sharpened at full size, the slight softness introduced by resizing for print or web use would force them to sharpen a second time, and double-sharpening tends to look pretty nasty. Integrating sharpening into the resizing algorithm takes advantage of the data you’re discarding, reducing the chance of introducing distracting artifacts.
