“In that moment, Imari grew up.”
This week, Imari flies solo. More precisely, Nagi can’t join the Let’s Explore A Manganese Mine expedition, so Our Big Little Bookworm’s insecurities come to the fore as she takes responsibility for the girls in territory none of them have visited before. Shoko is adorably confident in her chosen mentor, but Ruri gets a wee bit too snarky when deprived of her idol.
A tunnel collapse keeps them from reaching their destination, and that’s when a well-researched novel comes to the bookworm’s rescue. All’s well that ends well, and while Imari isn’t ready to fill Nagi’s… “shoes”, she doesn’t disappoint.
🎶 🎶 🎶 🎶
And I would gen 500 wives,
and I would gen 500 more.
Then quickly deathmatch through those thousand wives,
and blog the ones ranked 4.
🎶 🎶 🎶 🎶
This time around, we have a Qwen LoRA that actually works. Most of the ones I’ve tried have either changed nothing detectable or were overtrained to the point of making everything worse. Our new friend is Experimental-Qwen-NSFW, which has a very strong anime bias, and adds a touch of naughty even to prompts that don’t push its buttons. Also elf-ears and the occasional tail.
I used the revised-and-expanded retro-sf location & costume prompts, the new physical-expression-based moods, and the rest was unchanged. The LoRA exaggerated the poses and facial expressions, and despite its flaws (strange fingers, extra limbs, knock-knees, and “poorly-set broken bones” being quite common), it livened things up nicely. It even added some Moon diversity.
Downside: the refine&upscale pass had a tendency to magnify the LoRA’s flaws. In one case, it took two perfectly normal hands and added extra fingers in odd positions. In another, it changed the poses of the men in the background to match the gal’s sexy walk, which just looks goofy. About a dozen of them were made objectively worse, and another half-dozen had changes I didn’t care for, even if you wouldn’t know unless you saw the original. This is using the commonly-recommended 4xUltrasharpV10 upscaler, but I get the same sort of changes with others.
I briefly flirted with a tool for converting the metadata to a CivitAI-compatible format for uploading there, but the author silently changed its defaults to overwrite your saved originals, destroying the original metadata that lets you reload the exact settings in SwarmUI and refine/upscale. Fortunately I tested it on only one image. It also prefers to destructively modify an entire directory at once, so, yeah, not linking to that tool’s repo. It’s a simple JSON massage to the EXIF data, so I’ll just roll my own at some point. Or have Claude do it.
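When I do roll my own, it'll be something like this stdlib-only sketch. The field names in the key map are hypothetical placeholders (the real ones would come from CivitAI's documentation); the non-negotiable design choice is writing the converted metadata to a copy and refusing to overwrite anything:

```python
import json
from pathlib import Path

# Hypothetical key mapping; the real CivitAI field names would come from
# their docs. The point is the shape: read, remap, write a COPY.
KEY_MAP = {"prompt": "prompt", "negativeprompt": "negativePrompt",
           "steps": "steps", "seed": "seed", "cfgscale": "cfgScale"}

def convert_metadata(src: Path, dst: Path) -> dict:
    """Remap generation metadata from one JSON layout to another,
    writing the result to dst and never modifying src."""
    if dst.exists():
        raise FileExistsError(f"refusing to overwrite {dst}")
    original = json.loads(src.read_text())
    converted = {new: original[old] for old, new in KEY_MAP.items()
                 if old in original}
    dst.write_text(json.dumps(converted, indent=2))
    return converted
```

Unknown keys are simply dropped rather than guessed at, and the original file is never opened for writing, which is the exact failure mode that tool had.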
Speaking of Claude, I gave up on using their suggested devcontainer approach in VS Code (which I didn’t really want to use anyway), and installed the node-based Claude Code CLI inside of a VMware virtual machine running Ubuntu 25. Code is rsynced to a shared folder to make it available inside the VM, so it can’t see anything else and can only operate on copies.
(these are the ones where the defects aren’t so bad I have to re-gen them with a variation seed; some that I almost included turned out to have mottled skin tones, mostly on the legs, and I like my waifuskin like I like my peanut butter: smooth and creamy; also, without nuts)
I approve of Imari’s attempts to improve Nagi’s wardrobe. She looks good in anything, and fan-artists have gleefully expanded her range. This week, we bounce into a game of Opals For Oppai!
Interesting to see Ruri’s reaction to L’il Red Shoko choosing Imari as her preferred mentor. A small detail, but character-developing.
I sent my sister a sample screenshot from the work-in-progress gallery-wall app, and she asked if I could make it work for her, as she also has walls in need of galleries. Not being a command-line kind of gal, the ideal solution would be for me to use something like py2app to bundle in all the dependencies so she can drag-and-drop a directory onto a self-contained app.
Turns out that’s quite hard, at least if you’re Windsurf & Claude Sonnet 4.5. So hard, in fact, that in the end I told it to back out to the last commit before we started, and it gleefully did a git hard reset that erased all traces of its 90 minutes of failure. The app packaging went fine after a few tries; it just couldn’t implement drag-and-drop or manage to quit cleanly. And the best part is that it learned nothing, and will make the exact same mistakes again tomorrow. Confidently, with exclamation points.
Pro tip: when your GUI app logs a line that says “run pkill to exit cleanly”, you have failed. Also, don’t gaslight the user by claiming that typing a directory name on the command line “is a better Mac-native solution than drag-and-drop”.
Breaking out the stone knives and bearskins, the simplest approach seems to be a three-line change to add a native-app wrapper with pywebview. py2app still blows chunks if you enable drag-and-drop, but at least it bundles up all the dependencies. The workaround is to use Automator to create another app that just launches the real one from a shell script with --args "$@", which is conceptually disgusting but functional.
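For the record, the wrapper looks something like this sketch. The window title and local URL are stand-ins for whatever the real app serves (`create_window` and `start` are genuine pywebview calls); the directory arrives on the command line via the Automator launcher's `--args`:

```python
import sys
from pathlib import Path

def gallery_dir(argv):
    """Return the first argument that names a directory (as passed by
    the Automator launcher via --args), or None to fall back to a prompt."""
    for arg in argv[1:]:
        p = Path(arg)
        if p.is_dir():
            return p
    return None

def main():
    import webview  # pywebview; imported lazily so the helper is testable
    target = gallery_dir(sys.argv)
    # Stand-in details: the real app serves the gallery from `target`
    # and points the native window at the local URL.
    webview.create_window("Gallery Wall", f"http://127.0.0.1:8000/?dir={target}")
    webview.start()

if __name__ == "__main__":
    main()
```

The launcher app is pure glue, so all the logic that can fail stays in the bundled Python app where it can at least be debugged.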
(I had ChatGPT create an icon for the app (which takes a while when you don’t pay them $20/month…); it does not feature Our Mighty Tsuntail)
Our DFC Ponytail-Bearing Redheaded Schoolgirl Pal (currently only known by her last name, although now that she’s slipped up and called Our Crystal-Crazy Heroine Ruri-chan, she’s sure to be Shoko-chan soon) stumbles across an unusual orange rock, then stumbles and loses it, leading to a deep-woods adventure that wipes out even the energetic Ruri, terminating in an abandoned factory. Whatever they were producing, the conditions were just right for making big orange crystals, as explained by Our Well-Rounded Mentor.
Nagi’s wisdom is as deep as…
(by the way, this is only the second role for L’il Red’s voice actress)
My wayward Amazon package finally arrived Friday night, and it was even intact (mildly surprising since it had no packaging whatsoever, just a label slapped on the side). There was no indication of how it went astray, like a second label or a half-dozen scenic postcards, but there is a punchline to the story.
The first entry in Amazon’s tracking has it starting out in San Diego, CA on the 28th. From there it went to: Cerritos, CA on the 29th; Hodgkins, IL on the 2nd; Bell, CA on the 6th; La Mirada, CA on the 7th; Hodgkins, IL and two cities in Ohio on the 9th; then the UPS depot up the street from me early on the 10th, and finally onto my front porch that evening.
The punchline? The shipping label on the package says it really shipped from Hebron, KY. Which is about an hour’s drive from my house.
(I guess it just wanted a little more flight time)
…is that it convinced Amazon I’m interested in ‘LitRPG’, a genre I have repeatedly run away screaming from. Not just because the genre is cursed with premature subtitlisis and epicia grandiosa, things that have been turning me away from overambitious new authors for decades.
The book? Fun, although I kept getting distracted by on-call alerts, so it wasn’t an in-one-sitting kind of read. I’ll buy the next one.
(announcing your Grand Epic Plans on the cover of your first novel is a curse that was infesting the SF/fantasy mid-list back when publishers used to sign damn near every first-time novelist to a three-book contract with ambitious delivery dates, only for both sides to discover that it takes more than a year to write a decent sequel to a book that was written part-time over five years)
In the end, I didn’t have Claude restructure the YAML file from $color/$loc/$time/$type to scene/$type/$color/$loc/$time; instead I had the bright idea of molesting it with a one-liner (unpacked for clarity):
grep : scenes.yaml |
perl -ne '
    next if /^#/;
    ($s,$k) = m/^( *)([^:]+):$/;
    $i = int(length($s)/2);
    $p[$i] = $k;
    print "." . join(".", @p[0..$i])," ",
        join("/", @p[0,4,1,2,3]),"\n" if $i == 4
' |
sort -k2 |
while read a b; do
    echo "# $b"
    yq $a scenes.yaml
    echo
done
TL/DR: I used the indentation level to populate an array, printed out the original structure as a yq selector and the new structure as a path, sorted by the new path, then dumped out each section. After that it was a single search-and-replace to indent all the items, and a quick Emacs macro to convert the paths into the new YAML structure. The thing that took the longest was removing the redundant indented keys, which technically wasn’t necessary to create the correct YAML structure.
Probably took less time than writing an explicitly detailed request to Claude.
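For the perl-averse, the same indentation-level trick as a stdlib Python sketch, assuming the same two-space indent and depth-4 target keys as the one-liner:

```python
import re

def key_paths(yaml_text, depth=4, order=(0, 4, 1, 2, 3)):
    """Walk key-only lines, using the indentation level to maintain a
    path stack; at the target depth, emit (yq selector, reordered path),
    sorted by the new path."""
    stack = {}
    out = []
    for line in yaml_text.splitlines():
        # Match "  key:" lines only, skipping comments and key: value pairs
        m = re.match(r'^( *)([^:#][^:]*):\s*$', line)
        if not m:
            continue
        level = len(m.group(1)) // 2
        stack[level] = m.group(2)
        if level == depth:
            keys = [stack[i] for i in range(depth + 1)]
            selector = "." + ".".join(keys)
            newpath = "/".join(keys[i] for i in order)
            out.append((selector, newpath))
    return sorted(out, key=lambda t: t[1])
```

Same idea as the @p array in the perl version: stale deeper entries in the stack don't matter, because they're overwritten before they're ever read at the target depth.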
To overcome the Apple-imposed limitations on wallpaper changes, I instructed Claude to write a little Python script that shuffles separate sets of images for each display at a chosen interval. I called it waifupaper, of course:
#!/usr/bin/env python3
"""
Waifupaper - Changes macOS wallpapers at fixed intervals

Bugs:
- doesn't work if wallpaper is currently set to rotate.
- fails to load images if called without full path to directories.
"""

import argparse
import random
import subprocess
import sys
import time
from pathlib import Path


def get_directory_state(directory):
    """Get the current state of a directory (modification time and file count)."""
    directory = Path(directory)
    try:
        # Get the directory's modification time
        mtime = directory.stat().st_mtime
        # Count image files
        image_extensions = {'.jpg', '.jpeg', '.png', '.bmp', '.gif', '.tiff', '.tif', '.heic'}
        file_count = sum(1 for f in directory.iterdir()
                         if f.is_file() and f.suffix.lower() in image_extensions)
        return (mtime, file_count)
    except Exception:
        return None


def get_image_files(directory):
    """Get all image files from a directory."""
    image_extensions = {'.jpg', '.jpeg', '.png', '.bmp', '.gif', '.tiff', '.tif', '.heic'}
    directory = Path(directory)
    if not directory.exists():
        print(f"Error: Directory '{directory}' does not exist", file=sys.stderr)
        sys.exit(1)
    if not directory.is_dir():
        print(f"Error: '{directory}' is not a directory", file=sys.stderr)
        sys.exit(1)
    images = [
        str(f.resolve()) for f in directory.iterdir()
        if f.is_file() and f.suffix.lower() in image_extensions
    ]
    if not images:
        print(f"Error: No image files found in '{directory}'", file=sys.stderr)
        sys.exit(1)
    return images


def get_display_count():
    """Get the number of connected displays."""
    try:
        # Use system_profiler to get display information
        result = subprocess.run(
            ['system_profiler', 'SPDisplaysDataType'],
            capture_output=True,
            text=True,
            check=True
        )
        # Count occurrences of "Resolution:"
        count = result.stdout.count('Resolution:')
        return max(1, count)  # At least 1 display
    except subprocess.CalledProcessError:
        return 1  # Default to 1 display if command fails


def set_wallpaper(image_path, display_index=0):
    """Set wallpaper for a specific display using AppleScript."""
    # AppleScript to set wallpaper for a specific desktop
    script = f'''
    tell application "System Events"
        tell desktop {display_index + 1}
            set picture to "{image_path}"
        end tell
    end tell
    '''
    try:
        subprocess.run(
            ['osascript', '-e', script],
            check=True,
            capture_output=True
        )
    except subprocess.CalledProcessError as e:
        print(f"Warning: Failed to set wallpaper for display {display_index + 1}: {e}",
              file=sys.stderr)


def main():
    parser = argparse.ArgumentParser(
        description='Rotate wallpapers on Mac displays at fixed intervals',
        formatter_class=argparse.RawDescriptionHelpFormatter,
        epilog='''
Examples:
  %(prog)s ~/Pictures/Wallpapers
  %(prog)s ~/Pictures/Nature ~/Pictures/Abstract -i 60
  %(prog)s ~/Pictures/Wallpapers -s -i 120
  %(prog)s ~/Pictures/Nature ~/Pictures/Abstract -1 -3
'''
    )
    parser.add_argument(
        'directories',
        nargs='+',
        help='One or more directories containing wallpaper images'
    )
    parser.add_argument(
        '-i', '--interval',
        type=int,
        default=30,
        help='Interval in seconds between wallpaper changes (default: 30)'
    )
    parser.add_argument(
        '-s', '--sort',
        action='store_true',
        help='Sort images instead of shuffling (default: shuffle)'
    )
    parser.add_argument(
        '-1', '--display1',
        action='store_true',
        help='Only affect display 1'
    )
    parser.add_argument(
        '-2', '--display2',
        action='store_true',
        help='Only affect display 2'
    )
    parser.add_argument(
        '-3', '--display3',
        action='store_true',
        help='Only affect display 3'
    )
    parser.add_argument(
        '-4', '--display4',
        action='store_true',
        help='Only affect display 4'
    )
    parser.add_argument(
        '-v', '--verbose',
        action='store_true',
        help='print verbose output'
    )
    args = parser.parse_args()

    # Determine which displays to affect
    selected_displays = []
    if args.display1:
        selected_displays.append(0)
    if args.display2:
        selected_displays.append(1)
    if args.display3:
        selected_displays.append(2)
    if args.display4:
        selected_displays.append(3)

    # If no specific displays selected, affect all displays
    affect_all_displays = len(selected_displays) == 0

    # Validate interval
    if args.interval <= 0:
        print("Error: Interval must be a positive number", file=sys.stderr)
        sys.exit(1)

    # Get display count
    num_displays = get_display_count()
    if args.verbose:
        print(f"Detected {num_displays} display(s)")

    # Validate selected displays
    if not affect_all_displays:
        for display_idx in selected_displays:
            if display_idx >= num_displays:
                print(f"Warning: Display {display_idx + 1} selected but only {num_displays} display(s) detected",
                      file=sys.stderr)
        # Filter out invalid display indices
        selected_displays = [d for d in selected_displays if d < num_displays]
        if not selected_displays:
            print("Error: No valid displays selected", file=sys.stderr)
            sys.exit(1)

    # Prepare image lists for each display
    display_images = []
    directory_states = {}  # Track directory modification times

    # Determine which displays will be managed
    if affect_all_displays:
        managed_displays = list(range(num_displays))
    else:
        managed_displays = sorted(selected_displays)

    if args.verbose:
        print(f"Managing display(s): {', '.join(str(d + 1) for d in managed_displays)}")

    for i in managed_displays:
        # Use the corresponding directory, or the last one if we run out
        dir_index = min(managed_displays.index(i), len(args.directories) - 1)
        directory = args.directories[dir_index]
        images = get_image_files(directory)
        if args.sort:
            images.sort()
        else:
            random.shuffle(images)
        display_images.append({
            'images': images,
            'index': 0,
            'directory': directory,
            'display_index': i  # Store the actual display index
        })
        # Track initial directory state
        directory_states[directory] = get_directory_state(directory)
        if args.verbose:
            print(f"Display {i + 1}: {len(images)} images from '{directory}'")

    if args.verbose:
        print(f"\nRotating wallpapers every {args.interval} seconds")
        print("Monitoring directories for changes...")
        print("Press Ctrl+C to stop\n")

    try:
        iteration = 0
        while True:
            # Check for directory changes before setting wallpapers
            for display_data in display_images:
                directory = display_data['directory']
                current_state = get_directory_state(directory)
                # If directory state changed, reload images
                if current_state != directory_states.get(directory):
                    display_num = display_data['display_index'] + 1
                    if args.verbose:
                        print(f"📁 Directory changed: '{directory}' - reloading images...")
                    new_images = get_image_files(directory)
                    if args.sort:
                        new_images.sort()
                    else:
                        random.shuffle(new_images)
                    display_data['images'] = new_images
                    display_data['index'] = 0
                    directory_states[directory] = current_state
                    if args.verbose:
                        print(f"  Loaded {len(new_images)} images for display {display_num}\n")

            # Set wallpaper for each display
            for display_data in display_images:
                images = display_data['images']
                current_index = display_data['index']
                actual_display_idx = display_data['display_index']
                image_path = images[current_index]
                image_name = Path(image_path).name
                if args.verbose:
                    print(f"Display {actual_display_idx + 1}: {image_name}")
                set_wallpaper(image_path, actual_display_idx)
                # Move to next image, wrap around if needed
                display_data['index'] = (current_index + 1) % len(images)
                # Reshuffle when we complete a cycle (if not sorting)
                if display_data['index'] == 0 and not args.sort and iteration > 0:
                    random.shuffle(display_data['images'])
                    if args.verbose:
                        print(f"  → Reshuffled images for display {actual_display_idx + 1}")

            iteration += 1
            if args.verbose:
                print()
            time.sleep(args.interval)
    except KeyboardInterrupt:
        if args.verbose:
            print("\n\nWallpaper rotation stopped.")
        sys.exit(0)


if __name__ == '__main__':
    main()
Fun fact: Apple’s virtual desktop ‘spaces’ have their own wallpaper settings, which means that each display has different wallpaper settings for each ‘space’. And if you want to keep the menubar on your main display, you have to tick the ‘use same spaces on all displays’ setting.
But ‘spaces’ are not manageable via AppleScript, so changing wallpaper affects only the active space. Which means that if this script is running in the background, it will effectively follow you from space to space, updating the wallpaper on the active one. Which is kind of what I wanted anyway, but isn’t what the built-in rotation does. Apple’s standard behavior uses undocumented private APIs, which is a very Apple way to do things these days.
Revised the lighting & composition wildcards, revised the retro-SF costume wildcards. Next up will be throwing all the retro-SF location prompts into the Claude-blender and having it generate new ones broken down by category; not only are the current ones getting over-familiar, they come from several different models and a variety of prompts, with varying degrees of retro-SF-ness. After that I’ll probably throw the pose file at it; I did some manual categorization, but it’s still a real mish-mash of styles. Either that or the moods and facial expressions, which most models don’t handle well conceptually; I’m going to try asking for the physical effect of words like “happy”, “sexy”, “eager”, “playful”, “satisfied”, etc, and see if it produces something an image-generator can differentiate from “resting bitch face”.
Yet Another Example of words not to use with Qwen Image: the fashion term “cigarette pants” is taken literally, with half-smoked butts randomly placed around the hips. Only once did it put one in the gal’s hand, and I don’t think it ever interpreted it as “skinny pants”.
Also, I’m making a note to go through the costume components and downweight many of them; quilting and padding wear out their welcome quickly, especially when they get applied to gloves and make it look like she’s wearing oven mitts.
🎶 🎶 🎶 🎶
I won’t count fingers and toes,
as long as you make pretty waifus,
Genai, you fool.
I won’t count fingers and toes,
but I want limbs on the correct sides,
and each type should have only two.
🎶 🎶 🎶 🎶
[wow, I must be tired and distracted by the combination of a houseguest and a busy on-call week; I didn't even notice that I never reviewed the episode...]
This week, The Tale Of The Abandoned Rock-Lover Who Finally Found a Home. With occasional really goofy face distortions that look like CGI rotations without the middle bits. On the plus side, we get one of Our Gals into a bikini. On the minus side, it's Ruri, who's definitely girl-shaped, but no competition even for Gal Gal, much less Our Varsity Over-The-Shoulder-Boulder-Holder Team.
That Package(TM) has moved again. Two days after leaving California, it was back to the same depot in Illinois that it visited a full week ago. And then it made it to (the other end of) Ohio Thursday night. Looking this morning, it appears to have reached the depot that’s literally down the street from my house.
So unless it goes back to California again, I should finally have it tonight.
…without telling me you know nothing about legibility:
(also, “tell me you’re desperately hoping people will mistake your derivative crap for Japanese derivative crap…”)
I took a YAML file of lighting/composition/angle prompt components and threw it at Claude, instructing it to break them up into categories like indoor/outdoor, day/night, portrait/natural lighting, color/black-and-white, etc, then flesh out each category in the new YAML file up to 50. It worked quite well, with a few exceptions:
Next up will be applying the same categorization and refresh to the settings, poses, and outfits, but not until I recover from having a houseguest this week…
Just noticed that Crunchyroll has a PG rating for this show, promising nudity and profanity. So not happening.
Anyway, Ruri suffers a brief bout of imposter syndrome, then rereads her notes to discover that she did miss something, and only careful review kept them on the right track. In the end, they not only find the sapphires, but uncover a bit of local lore.
What sort of hills and valleys will they explore next?
I rearranged things in my office so that the 4K vertical monitor sits between the MacBook Air and the Mini, connected to both. Given that both Macs have widescreen displays, naturally I configured the Dock to appear on the left, as I always do. Except that In Their Infinite Wisdom, Apple has decreed that if the Dock is on the left, it must appear on the left of the leftmost monitor, even if that is not the main monitor. You can force the menu bar to appear on the main monitor regardless of position, but if the Dock is on the side, it must be aaallllllllll the way to that side. Even if that’s literally several feet away from the display that’s right in front of your face.
So I had to move the Dock to the right of my laptop display, like some sort of heathen who eats from a dumpster. It’s quite unnerving. I don’t look there. I never look there. I have over twenty years of practice not looking there.
Amazon order for $RANDOM_OBJECT placed on September 28. Shipped via UPS on September 29. Until about an hour ago, the last update from UPS had it in Illinois on the 2nd, but now it’s arrived in… California!
I’m old enough to remember when Amazon was good at logistics. Also “packaging items so they don’t get destroyed in transit”.
OpenAI has discovered that it bought a pig in a poke with Jony Ive’s Mysterious Pocket AI Device. For 6.5 billllllllion dollars. To make it work, whatever “it” is, they not only need more AI server capacity, but also a better-than-their-flagship-chatbot to drive it. And the Ars commentariat will be there to do whatever it is they do.
They really should have asked Grok and Claude if this was a wise investment. Or maybe they did…
Yeah, there’s a lot of hand-waving involved in how to do this, with no provision for “hey, your official Dockerfile blew chunks”. Meanwhile, I’m not the only person frustrated with Windsurf’s expensive failure policy. It does no good for them to have a polished GUI if it eats up your monthly credit balance in an AI-enhanced “abort, retry, fail” loop.
Not that I actually want to use VS Code with or without a vendor-specific skin on top. The only reason I even tried a pay-to-play “AI” IDE is that it produced a better coding experience for my initial test project (at no cost to me), and all of that goodwill was lost when the second project got bogged down and had to be spoonfed half a dozen screenshots to get it back on track. The vacation scheduler was over 50 passes in when I ran out of trial credits, with some functionality still untested because of blocking bugs, and it might have been able to fix them if it hadn’t eaten a bunch of credits.
Anyway, I bit the bullet and got a paid Claude subscription for a while, and if I can get their Coding UI to work in a secured sandbox, I’ll turn it loose and see what it does to the remaining bugs and feature requests. No chance in hell I’m going to give a Node.js “AI” app direct access to my shell…
(“no, you can’t tempt me; I know there’s node.js under that fur!”)
I was reviewing image-generation prompts enhanced by an LLM, and ran across something worse than having it randomly switch to Chinese (which it also did):
Note: Please replace “her” with the appropriate pronoun depending on the woman’s gender.
Yeah, that one’s going in the trash heap. The LLM, not just the prompt. (it was a derivative of OpenAI’s free model)
This week, Thicc Girls goes double platinum. Can’t wait for the world tour. Poor Imari really got thrown in the deep end, though; “Oh, we’re not panning in the river today, we’re taking these boulders to the mountains!”
New explicit video from Maplestar. (there are some known errors in this version that will be fixed soon; none of them affect the “thrust” of the story…)
(he gets her in the end; also, she gets him in the end)
Yesterday’s attempts at app enhancement foundered on the rocks of “Model provider unreachable”. It still charged me “credits” for trying, but didn’t do anything. This would be annoying if these weren’t free trial credits, but it still brings me closer to running out. Pro tip: don’t keep trying when you hit this error…
Dropping from the preview of Sonnet 4.5 to standard Sonnet 4 might have reduced its “intelligence” a bit, but had the advantage of actually running. I hadn’t switched to 4.5 anyway; it did that for me to promote the new one, charging fewer credits to do… less.
Today’s original-requirement-finally-implemented is all-day events. I think I have just enough credits left in my trial to add a user-prefs page with defaults and password changes (“using best practices for security”), session cookies (“using best practices for security”), and deep-copying an existing project into a new one.
Once I run out of credits, then I decide if it’s worth $15/month to throw more of my dusty old projects at it. Alternatively, since Windsurf is mostly just a custom VS Code skin connecting to third-party models (with some secret sauce in the system prompts and context handling), and it’s been suggesting the Claude Sonnet models, I could download VS Code and the Claude extension and pay them directly, since I’ve gotten good creative results from the free level of Claude.
It also appears that you can run Claude’s coding assistant inside a Docker container and just export your repo directory into it, allowing it to run commands on a completely virtualized environment without network access to anything but their LLM API (once it downloads its CLI tool, which of course is written in node.js, sigh).

(Will I release the final project? Sure, why not; it’s not like I wrote it…)
It’s always surprising when they mention that Ruri’s in high school, since she often behaves like a much-younger girl. This week, they add a reminder in the form of a classmate who can stack up next to Our College Gals. Also a future partner-in-rock-crime who’s slowly being nudged in front of the camera.
The mineral-of-the-week is itty-bitty little grains of sapphire, slowly being traced up-river by tedious visual inspection of individual grains of sand. My back hurts just looking at Ruri leaning over a microscope; there’s a reason professors pawn this sort of work off on grad students, and that Nagi pawns it off on Ruri…
I expected more from Ai Shinozaki’s first tentacle video.
Seven Seas has announced new translated manga titles, including Let’s Run an Inn on Dungeon Island! (In a World Ruled by Women). This is softcore porn featuring a runty little guy who gets transported to a world where big buff women dominate, and as they wash up on his deserted island, he alternately bangs them silly and endures cliché role-reversed sexual harassment. Alternate title: Cocksman of Reverse-Gor. Of course he also has OP cheat magic.
I haven’t owned an honest-to-Ansel photo printer in a very long time. I had one of the original 4x6 HP PhotoSmart printers with the lickable ink, and I had a rather finicky dye-sub printer for a while, but both were back when I was shooting on film and scanning slides with a color-unmanaged SCSI Nikon scanner that had an unreliable automatic slide feeder, which would be 20+ years ago.
It’s been on my mind for a while now, and the cleanup of my office made room for one, but I managed to talk myself out of buying a big one, settling for Canon’s A3+ (13” x 19”) model. Mostly because of sticker shock over the cost of ink cartridges for the A2/B3 PRO-1100.
Color management has come a long way over the past 20 years, to the point that every print I’ve tried has come out exactly as expected, including my genai gals and memes. And the ecosystem around fine-art printers is stable, with third-party paper companies publishing ICC color profiles that smoothly integrate into the workflow, so that you can pretty much just click “print”.
The only significant advance in framing technology, on the other hand, is the widespread availability of non-reflective coated glass at framing shops. You’ll still get a lot of glare on premade frames, but if you can afford professional framing, your stuff will be a lot more viewable.
Heartbreakingly expensive, though. Even with the fortunately-timed 70%-off sale at Michaels, my last framing batch was the size of a mortgage payment. Filled up a lot of wall space, but still not something to do lightly.
If I stick to commodity gallery-style frames in standard sizes, the PRO-310 will pay for itself, the ink, and the paper by filling just one wall of my living room with my Japan pics. The only trick is that for many common frame sizes, I need to print multiple images on one sheet of paper and cut them apart with a rotary trimmer, but I have one of those, and Canon’s software allows you to save custom multi-image print layouts and quickly drag images into them.
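The layout arithmetic is worth sanity-checking before committing to paper; a quick sketch that ignores printer borders and trim allowances:

```python
def prints_per_sheet(sheet_w, sheet_h, print_w, print_h):
    """How many print_w x print_h images fit on one sheet, trying both
    orientations of the print (simple grid layout, no margins)."""
    upright = (sheet_w // print_w) * (sheet_h // print_h)
    rotated = (sheet_w // print_h) * (sheet_h // print_w)
    return max(upright, rotated)
```

On a 13x19 sheet that works out to nine 4x6s or a pair of 8x10s per pass through the printer, minus whatever the rotary trimmer needs between cuts.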
(and, yes, I printed Red Waifu)
Speaking of which, I made a batch of 500 retro-SF dynamically-prompted pin-up gals in 9x16 aspect ratio to serve as rotating wallpaper on the vertical monitor, then deathmatched them down to 18, and while one poor gal spontaneously grew an extra finger during the refine/upscale process, it should be fixable with a bit of variation seeding. Later; at 33 minutes/image to refine and upscale, I’m done for the day.
I usually just post full-sized images and let the browser scale them down, but 18 2160x3888 images is a lot of mumbly-pixels, even lazy-loading them, so click the preview images to see the big ones (no, not Nagi).
Our second adult fan-service provider has entered the show in the form of a busty underrim-glasses-wearing bookworm, Imari. Could this show get any better? Well, yes, but jello-wrestling probably isn’t on the schedule.
This week, everyone falls in a hole in order to stumble across the Mineral Of The Week. Nagi is not only the voice of reason, but also the voice of A Proper Mentor, getting Imari out of her comfort zone and advancing her career path.
Verdict: thicc is apparently justice.
My deathmatch project was quite small and self-contained, and even offline LLMs were able to wrap their tiny little “minds” around it, more or less. For my next attempt, I used the free trial of Windsurf to build something more elaborate: a vacation planner that mixes the categorized lists of Trello with the easy drag-and-drop of the Planyway plugin. That is, jettison all the other crap in both apps, and just have a bunch of events that can easily be moved around both between days and within days, so you can lay out an itinerary for the day and quickly update it on the fly. The real bonus is getting timezones right, which Trello still doesn’t do; it always edits and displays events in the web browser’s local time zone.
This time the experience wasn’t so smooth. I’m a dozen passes in, and it’s deferred the implementation of all of the drag-and-drop features because that’s apparently hard. The basic framework is there, but despite being part of the original specs, a fair number of features required multiple attempts to implement at all, much less correctly. Timezone issues required multiple screenshot uploads with detailed explanations. On the bright side, the screenshots and explanations actually worked.
In other words, it required a very detailed set of specifications, didn’t implement all of them, and would never have made any progress at all without extensive human testing and debugging experience. Even though this is still a tiny application (~1,000 lines of Python, ~1,000 lines of Javascript, and ~800 lines of HTML/CSS), it couldn’t be “vibed”.
Interesting note: it never occurred to the LLM that the color used to display text on a colored background mattered. I had to invoke the magic words “best practices for accessibility” to restrict the background color palette, and “strongly contrasting color” to ensure legibility. It then used actual Web Content Accessibility Guidelines. I’d required best practices for security and authentication, but did not add the same wording to each other section of the specs…
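For reference, the contrast check it eventually applied boils down to one formula: WCAG defines a relative luminance for each color and requires a ratio of at least 4.5:1 for normal text. A minimal sketch:

```python
def _channel(c8):
    """Linearize one 8-bit sRGB channel per the WCAG 2.x definition."""
    c = c8 / 255
    return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4

def contrast_ratio(rgb1, rgb2):
    """WCAG contrast ratio between two (r, g, b) colors, from 1 to 21."""
    def luminance(rgb):
        r, g, b = (_channel(c) for c in rgb)
        return 0.2126 * r + 0.7152 * g + 0.0722 * b
    l1, l2 = sorted((luminance(rgb1), luminance(rgb2)), reverse=True)
    return (l1 + 0.05) / (l2 + 0.05)
```

Black on white scores the maximum 21:1, while mid-gray #777 text on white lands just below the 4.5:1 bar — exactly the kind of near-miss an LLM won’t notice until you invoke the magic words.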