“Okay, what’s my motivation for sex?”
“You’re a man, dear; anything short of death is motivation for sex.”
— Where The Boys Aren’t
Not mine, sadly, but the ancient NetEngine WebEngine that was dotclue.org for so many years. I pulled it from the co-lo on my way into work this morning, and its reward for fourteen years of faithful service will be a disk scrub and an e-waste bin.
By the way, for all the sometimes-deserved criticism that OpenBSD and its wranglers get, I was still running v3.3 without anyone ever successfully breaking in. I locked it down with a very small set of services, and required non-root logins with ssh keys, and Theo’s Paranoid Army took care of the rest. I applied the various security patches that came out in 2003-2004, but that’s it.
I don’t recommend not updating your server for 14 years, but you can go a lot longer between updates if you start with something designed for security.
Amusingly, I still own the even-older server that hosted munitions.com back in the days when it was shared between folks at WebTV, but I doubt I have anything left that could mount those disks to scrub them, so they’ll just get the sledgehammer treatment, and then go into the e-waste bin.
I refuse to apologize for what happened when the soundtrack from Mary Poppins was playing as I read Mauser’s comment about Lollygagging. Dedicated to Roman Polanski, of course.
“…and every pass I plan to make,
involves Delicious Cake.
A law, or three, won’t save young girls from me…”
“Some champagne and a quaalude made the little girl go down,
the little girl go down,
the little girl go down.
But the booze, pills, and ass-rape were all felonies, I found,
so I fled the USA.”
It felt lonely in here, so I got Isso
working for comments. Easy to nuke-and-pave if I don’t like it, at
least. The whole “Python virtualenv” experience was a real pain in the
ass, though, since pip install repeatedly claimed to have installed
all the dependencies, while pip list called bullshit on that.
I’ll probably have to put Monit on the server in case it crashes, but that can wait.
It’s possible to have Isso dynamically update the comment count in the article metadata block, but I just spent about an hour failing to get it to work, between Isso’s and Hugo’s overlapping limitations.
On the Isso side, you can either show the comment form or add counts to a page. They’re conflicting JavaScript includes, according to the docs. I could write my own bit of jQuery to make an ajax call to retrieve the count and insert it into the page, but I thought that would be more work.
Until I ran into Hugo’s variable-scoping. When you render content in a
list context, you’re really fully rendering each page in its own
context and then including the results. So, inside a template,
variables like $.Kind and $.URL refer to the individual article’s
context, as if you were currently writing out that one article to
disk. And of the two completely different ways you can set variables,
one of them is strictly block-scoped, and the other is strictly
page-scoped. You can’t pass either down into a partial template.
(there’s a partial-with-arguments called a shortcode, but that’s a completely different beast, and I’m not sure it’s either effective or efficient to replace all your partials with shortcodes)
UPDATE: completely impossible, in fact; shortcodes don’t work in template files, and partial templates don’t work in content files. They’re completely different things with completely different behaviors. You have to construct a custom dictionary and pass it into a partial template, which is butt-ugly and error-prone.
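For reference, the dictionary workaround looks something like this; the partial name and keys here are invented for illustration, not taken from this site’s actual templates:

```
{{/* in a list template: bundle up the page variables by hand */}}
{{ partial "comment-count.html" (dict "URL" .URL "Kind" .Kind) }}

{{/* layouts/partials/comment-count.html: the keys come back as fields */}}
<span class="comment-count" data-uri="{{ .URL }}"></span>
```

Every variable the partial needs has to be listed by hand, which is exactly the error-prone part: forget one key and you silently get an empty value.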
So, yeah, no comment-count on the home page at the moment.
I wrote my own bitty jQuery function to use Isso’s API directly and insert the comment count on page-load. It would be nice if the API returned “0” instead of 404 errors when there aren’t any comments, though.
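A rough sketch of that approach (written with plain fetch rather than jQuery; the /isso/ mount point and the total_replies field are assumptions about a typical Isso setup, so check yours):

```javascript
function renderCount(n) {
  // Turn a reply count into display text.
  return n === 1 ? "1 comment" : n + " comments";
}

function insertCommentCount(el, uri) {
  // Ask Isso for the thread at this page's URI; a 404 just means
  // nobody has commented yet, so treat it as zero instead of an error.
  return fetch("/isso/?uri=" + encodeURIComponent(uri))
    .then(function (r) { return r.ok ? r.json() : { total_replies: 0 }; })
    .then(function (data) {
      el.textContent = renderCount(data.total_replies || 0);
    });
}
```

Treating the 404 as a zero count on the client side is the workaround for the API quirk complained about above.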
A long time ago, in a Usenet newsgroup far, far away, in response to a post on “Top Ten Reasons Magic is Better than Sex”, I wrote:
(I dug this out because I found the old “recently-spotted” link where someone had translated them all into Spanish. Link was still good, to my surprise.)

(from the NSFW game HuniePop)
Shamus didn’t precisely recommend HuniePop, but he did say that the Bejeweled-ish gameplay was far superior to the original and most of its clones, and that he felt quite uncomfortable with the dating-sim elements, particularly the “overnight date” where you play a twitch version of the puzzle to “score” with the young ladies. So, to be more precise, he did recommend it, but only to pervs who like anime-style cheesecake and hilarious simulated moaning with their match-3 puzzles.
The gameplay is engaging, and it’s completely free of the fetishes it would have if it were a real Japanese dating sim. Meet girls, impress them with your Mad Match-3 Skilz, admire the naughty pictures they send you, and giggle at the noises the voice actresses make as your score goes up and down in the twitch puzzles. There is an easy-to-apply “adult” patch if you buy it on Steam (create a file with the correct name), but all it does is unlock a few pictures that are more detailed and less appealing.
The art and voice acting are mostly quite pleasant in the dating-sim component, making the girls quite appealing. The unlockable characters are pretty easy to get, and include alien bounty hunter Celeste, catgirl Momo, and love-fairy Kyu. Once you’ve collected the whole set (pokémon joke omitted…), there’s one final secret character, and then an “unlimited” mode.
There are two major drawbacks: it only saves when you leave an area
(so you can’t upgrade your stats and buy/sell things, then exit), and
the Mac version stores your save file in the cache folder, which can
get wiped if you upgrade the OS. This is apparently a common problem
with games built on the Unity engine. So, be sure to save the contents
of this directory frequently:
~/Library/Caches/unity.HuniePot.HuniePop
Welcome to the first non-trivial update to this blog since 2003. Things are still in flux, but I’m officially retiring the old co-lo WebEngine server in favor of Amazon EC2. After running continuously for fourteen years, its 500MHz Pentium III (with 256MB of RAM and a giant 80GB disk!) can take a well-deserved rest.
The blog is a complete replacement as well, going from MovableType 2.64 to Hugo 0.19, with ‘responsive’ layout by Bootstrap 3.3.7. A few Perl scripts converted the export format over and cleaned it up. Let’s Encrypt allowed me to move everything to SSL, which breaks a few graphics, mostly really old YouTube embeds, but cleanup can be done incrementally as I trip over them.
Comments don’t work right now, because Hugo is a static site generator. I’ve worked out how I want to do it (no, not Disqus), but it might be a week or so before it’s in place. All the old comments are linked correctly, at least.
Do I recommend Hugo? TL;DR: Not today.
Getting out of the co-lo has been on my to-do list for years, but I never got around to it, for two basic reasons:
I was hung up on the idea of upgrading to newer blogging software.
I didn’t feel like running the email server any more, and didn’t like the hosting packages that were compatible with MT and other non-PHP blogging tools.
In the end, I went with G-Suite (“Google Apps for Work”) for $5/month. Unlike the hundreds of vendor-specific email addresses I maintain at jgreely.com, I’ve only ever used one here, and all the other people who used to have accounts moved on during W’s first term.
Next up, working comments!
Actually, next turned out to be getting the top-quote to update
randomly. The old method was a cron job that used wget to log into
the MT admin page and request an index rebuild, which, given the tiny
little CPU, had gotten rather slow over the years, so it only ran
every 15 minutes.
Since the site is now published by running hugo on my laptop and
rsyncing the output, it’s not feasible or sensible to update the quotes
by rebuilding the entire site. So I wrote a tiny Perl script that regexes
the current quotes out of all the top-level pages for the site,
shuffles them, and reinserts them into those pages. It takes about
half a second.
Since there are ~350 pages, there will be decent variety even if I don’t post for a few days and regenerate the set. If I wanted to get fancy, I could parse the full quotes page and shuffle that into the indexes, guaranteeing a different quote on each page (as long as the number of quotes exceeds the number of pages, which means I can add about 800 blog entries before I need to add more quotes). :-)
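The actual script is Perl, but the shape of it can be sketched in JavaScript; the <!-- quote --> markers here are invented stand-ins for whatever pattern the real regex matches in the generated pages:

```javascript
// Each rendered page is assumed to wrap its quote in marker comments.
var QUOTE_RE = /<!-- quote -->([\s\S]*?)<!-- \/quote -->/;

function extractQuote(html) {
  // Pull the current quote out of one rendered page.
  var m = html.match(QUOTE_RE);
  return m ? m[1] : null;
}

function replaceQuote(html, quote) {
  // Drop a different quote back into the same slot.
  return html.replace(QUOTE_RE, "<!-- quote -->" + quote + "<!-- /quote -->");
}

function shuffle(list) {
  // Fisher-Yates shuffle, in place.
  for (var i = list.length - 1; i > 0; i--) {
    var j = Math.floor(Math.random() * (i + 1));
    var t = list[i]; list[i] = list[j]; list[j] = t;
  }
  return list;
}

function shuffleQuotes(pages) {
  // Collect every page's quote, shuffle them, and reinsert: the same
  // quotes end up distributed across different pages.
  var quotes = shuffle(pages.map(extractQuote));
  return pages.map(function (html, i) { return replaceQuote(html, quotes[i]); });
}
```

The real version also reads and rewrites the files on disk, which is where the half-second mostly goes.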