Federico Viticci’s Not an iPad Pro Review is, I think, a scathingly well-done piece.

Having used Android tablets as an all-the-things primary computer for the better part of a decade, I found the problem of background processes to be a particular killer. It effectively relegated my iPad from a fully productive machine to a fully-everything-else machine that probably cost twice as much as the Galaxy Tab S-series it replaced. In fact, I virtually never reach for the Magic Keyboard and fire up an SSH client on my iPad. I’ll break out a heavy-ass laptop or walk across my home if I need that, because it’s annoying as fuck when you switch apps and your session is gone. By contrast, I readily used my Android tablets for SSH tasks, both professionally (ssh -> build server) and for local things (ssh -> my servers).

I also rather like the notion of Desktop-class apps as a term. Apple’s spiel about tablet-optimized apps versus Android was largely bullshit and handwaving back in the day, but I think Desktop-class captures the distinction well. My aging iPad Pro grants access to a few professional-quality apps that are available on iPadOS, like Working Copy, Procreate, and LumaFusion, but relatively few Desktop-class applications.

By contrast, my Android tablets were often good enough for dock-to-a-monitor-and-have-at-it modularity, because guess what? Some crazy guy probably wrote a desktop-class application worth paying for, or no one actually gave a fuck. On iPadOS, the best thing I can generally say about most of the software is that app xyz is almost the same as xyz on Android. There are a few that I miss even when using Macs and PCs. And as Federico notes, there’s a plethora of things that just can’t exist on iPad because there’s no support for building them.

Multi-tasking is kind of a meh topic to me, but I think his description of how it’s evolved is spot on. Personally, I like the more full-screen, task-centric nature of Android and iOS. I wrote about that plenty of times in the 2010s, back when G+ was a thing, and in a few journal entries here. The whole floating-window thing I find rather nice if you have a 20″ to 30″ monitor, but not so useful when you cut that screen in half, like a tablet or a laptop. I appreciate the ability to split screen, slide over, or float windows on my tablet, but not as much as I appreciated Android apps letting me do things like switch between a terminal session and an email without fucking up what I’m doing.

Stage Manager kind of squeaks by for me. On Mac, I enjoy Stage Manager because it helps organize and group windows for working on tasks. On iPad, I mostly view Stage Manager as a sucks-less way of switching between applications more than anything else. On the flip side, iPadOS did grow far better support for external monitors. But as for the actual multi-tasking experience, the most I can say all these years later is that I no longer have to reboot my iPad constantly when using Slide Over, because I basically never use it on purpose :P.

Or should we say, I enjoyed the quality of iPadOS’s launch version so much that if anyone ever bemoans the quality of my code, I’ll just ask if they ever did much with the first version of iPadOS 😂.

Beetlejuice, Beetlejuice, Beetlejuice!

It’s only taken 36 years, but it’s finally happening!

Considering I hauled my ass out to the theater some years back for the anniversary showing of Beetlejuice, I have a feeling the long overdue sequel will probably get me to go. But the real question is, will we see a Hawaiian cameo as a nod to one of the old script ideas? 😀

Revisiting one of my favorite anime, I’m again saddened that we’ll likely never see translations of the original Light Novels. I tend to enjoy stories that take tropes and turn them on their ear, and Maoyu is one of those. But more than that, I think it takes a more distinctive tack than most stories.

The Hero goes to slay the Demon Lord, and it quickly derails: he’s trying to cut down his archnemesis while she keeps side-stepping the sword and saying hello in greeting 🤣. It’s a classic trope for a confrontation to the death to erupt when two such forces meet. But things quickly flip over when they face two key problems: if the war suddenly ends, it will be a calamity for all, and if either side wins, it spells doom for the other. Rather than fighting, they pledge themselves to each other and seek the other side of the hill, that undiscovered country where both their dreams may be found: peace.

Thus the Demon Lord becomes the Crimson Scholar, finding ways for humanity to avoid the starvation and economic collapse the war’s end would cause; and the Hero becomes the Black Knight, cutting down foul human and demon alike to create a world where both sides may come together. But of course, since peace isn’t challenging enough, they must also face forces that would see the war continue in perpetuity for the sake of lining their own pockets.

I think that tropes are tropes for a reason. Romantic comedies and action movies alike are full of them: the couple will end up together, the good guy will triumph over the villain, yada yada. Sometimes that’s a good thing; tropes can make for nice cheesy goodness, and the patterns repeat because we enjoy the journey, not just the results. Yet that also means deviating from those established patterns in amusing ways can be quite a pleasant result.

A roux makes it better

One of the banes of my cooking experience has always been macaroni and cheese. It’s something I’ve never been able to make the way my mother did, and I generally don’t make it because I suck at it.

In Googling recipes, I encountered an interesting idea in one for baked macaroni and cheese: using a roux as the foundation of the cheese sauce. That’s definitely nothing like how my mother made it, and we didn’t bake ours either. I didn’t try the recipe itself, nor bake anything, but I wanted to try the concept, because it makes sense even if it’s not how I was taught.

The result? Well, a light roux of salted butter and flour, meeting milk and cheese, equals the best macaroni and cheese I’ve ever made. Ordinarily I find the results lacking, or calling for lethal doses, when I try to replicate my mother’s method. A roux isn’t how she did it, but it sure makes for a good meal 😋

One of the past year’s many changes has seen me shift from my cast irons to a pair of slightly more conventional skillets: 10″ and 12″ CAROTEs with a finish similar to the tamagoyaki pan I’ve enjoyed using. They have quite literally become my go-to cookware.

A funny side effect, oddly enough, is that I’m more inclined to make a roux when it’s convenient. E.g., I might toss a batch of stir-fry meat in the skillet, set it aside, and then reach for the butter and flour. Combined with the greater use of beef stock in my cooking, this seems to make for a good sauce to cook some macaroni and accompaniments in, then just reduce until delicious and saucy.

Working off the theory that the various non-stick skillets my mother favored lasted considerably past their recommended service lives, I figure however long the granite-style coating lasts, I will have gotten my money’s worth years before they need replacement. Plus, there is the benefit that these fuckers are light enough to one-hand the 12″ just fine when I really need a larger skillet. By contrast, the hefty weight of the cast irons, and the risk of making like Barf with a flattened foot, largely make Monstro unwieldy as a 12″ skillet, even with two hands. I love the cast iron, but in practice Geppetto, the 10″, is the largest cast iron that’s practical. As a side effect, I’ve often ended up using my stainless steel pots where a skillet would be more convenient, because my favorite skillets are so damn heavy, lol.

A nice gander at the Apple Lisa

While the video might be a tad boring by contemporary standards (unless, like me, you have an interest in such ancient technologies 😛), I think it makes a nice demonstration of the system.

Since the guy is using actual hardware, it’s also slow as crap by modern standards. Let’s just say the world has come a long way since a Motorola 68k and a meg of RAM were plenty. But I think it was a fairly impressive and innovative system for its day.

I kind of prefer the Lisa’s Electronic Desktop metaphor to the conventional files-and-applications approach that the typical Windows 9x PC embodied some decades later. I love the document-centric rather than application-centric view as a concept. It seems like a good attempt at creating an environment for office workers who weren’t computer people. The ability to have files with the same name is odd, but interesting, if likely impractical for software developers. The natural saving and manipulation of content is nice.

In addition to the UI design, its relationship to the early Mac seems fairly apparent. In particular, one of the odd things I encountered digging into 1990s PowerBooks and System 7 is how the classic Mac OS treats placing files on the desktop (basically a flag saying a file is on the desktop) and how it handles floppy diskettes. Both are rather different from modern systems of any sort. It looks like a lot of the Lisa’s concepts made their way into the original Macintosh and later system versions.

It’s kind of a shame that the Lisa was insanely expensive, around $10,000 for a basic system, and (IMHO) rather slow. While I’m not convinced the original Mac was a good idea without at least a second floppy, its base price of $2,500 was at least less comical than the Lisa’s. Or should we say, 512K and way more storage would probably have been worth every penny, and still way cheaper than the Lisa.

Well, that’s surprisingly neat… it kind of makes me have to remind myself that I don’t need another clock, in unusually tempting ways.

Random things

Powered on Stark to test a boot stick and figured I’d let the system update itself. Went downstairs to wash out my coffee cup, and coming back, the line of sight from down the hall to where I left it on my desk reminded me of one of the things I don’t miss about the old Latitude: the screen!

Stark was from a transitional era, one in which more consumer-oriented laptops began to adopt Intel’s concept of an “Ultrabook” while more business-oriented laptops refused to give up their ports until you pried them from their cold, dead motherboards. But almost universally, they all agreed on having a shitty screen compared to basically everything else in computing at the time.

As such, while the laptop served me very well, it wasn’t without compromises. The typical 1366×768 screen was basically trash, but it did support external displays, and that’s how I tended to use Stark. Onboard were a VGA port (ha!) and a mini-HDMI port that nothing else of mine really fit, but as it got older, docking stations able to drive a pair of DisplayPort/HDMI outputs were cheaper than having one shipped off eBay, and the Intel chips back then maxed out at three display pipelines anyway. Ditto irksome things like having eSATA at the price of a SuperSpeed USB port, having to dedicate a USB port to a Bluetooth dongle, needing a fanny-pack-style battery to get runtime that wasn’t a joke, and weighing almost a kilogram more than I wanted to lug around every day.

But the machine also had its upsides: a TPM for encryption, a modular bay that could be fitted with an OEM optical drive or a caddy for a second 2.5″ SATA drive, and a Core i5 that actually served well up until the rise of Electron applications like Teams and Slack. It also helped that I had enough Latitude D/E-series-compatible chargers around to never worry, except when working away from an outlet.

All in all, Stark has the unique distinction of being a computer that managed not to piss me off more often than not. That’s not something many computers can say. So I think Stark was a successful machine, even if it’s going to stay retired, lol.

Network evolutions

Thus far, this is looking to be the evilicious plan.

  1. The ns1 VM on zeta will be converted from a primary name server that is authoritative for home.arpa and forwards to external DNS, to one that is only authoritative for my LAN and provides nothing else.
  2. Cream will be unretired and converted into infrastructure.
    • DHCP services (v4/v6)
    • DNS services à la ns2: secondary for my LAN, forwarding to external DNS.
    • CNAME aliases for ns3 and dhcp, because I’m silly that way.
  3. Reconfigure the Eero to use my Raspberry Pi Zero (ns2) and Cream as its name servers.
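As a rough sketch of that end state (assuming BIND, which I haven’t actually named here, and with all addresses being made-up placeholders), the split between ns1 and Cream would look something like:

```
// ns1 (VM on zeta): authoritative-only for the LAN zone.
// No recursion, no forwarding — it answers for home.arpa and nothing else.
options {
    recursion no;
    allow-transfer { 192.0.2.20; };   // placeholder address for Cream/ns3
};

zone "home.arpa" {
    type primary;
    file "home.arpa.zone";
};

// Cream (ns3): secondary for the LAN zone, forwarding everything else out.
options {
    recursion yes;
    forwarders { 192.0.2.53; };       // placeholder external resolver
};

zone "home.arpa" {
    type secondary;
    primaries { 192.0.2.10; };        // placeholder address for ns1
    file "home.arpa.zone.db";
};
```

The point of the split is that only ns1 holds the writable zone data, while Cream and ns2 both hold transferred copies and do the recursive/forwarding work for clients.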

The reasoning for this is that while Zeta could easily provide DHCP services, and the VM running name services has been effective, Zeta is not a machine that I want as a single point of failure; as a single point of truth, though, it’s convenient.

The notion here being: if Zeta sees downtime, then as long as the outage is shorter than the time ns2/ns3 take to expire the local zone information, the impact across my household is just the inconvenience of fixing a computer. That is to say, the failure is either transient enough not to cripple things, or epic enough to convert ns2 into the authoritative server, using the ready-to-swap-into-primary-mode setup I already have in place. And simply put, Cream wouldn’t be intended to be fucked with or rebooted, and would in turn have a similar failover: moving to Eero’s DHCPv4 and re-enabling forwarding on ns1.
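That “how long until the secondaries expire the zone” window is governed by the zone’s SOA timers. A hypothetical SOA record (all values illustrative, not taken from my actual zone) showing the knobs involved:

```
; Secondaries keep answering for the zone for up to `expire` seconds
; after they last successfully reached the primary.
home.arpa.  IN  SOA  ns1.home.arpa. hostmaster.home.arpa. (
        2024010101  ; serial
        3600        ; refresh: how often secondaries poll the primary
        900         ; retry: poll interval after a failed refresh
        1209600     ; expire: ~2 weeks of grace while the primary is down
        3600        ; minimum / negative-caching TTL
)
```

With an expire value like that, a dead primary gives roughly two weeks before the secondaries stop serving the zone, which is the window for either fixing Zeta or promoting ns2.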

Except for two pains in my ass:

  1. Cream’s CMOS battery has died in storage since its retirement.
  2. Cream is refusing to boot the install media.

Actually, I should probably just see if Magic (my old Raspberry Pi 2) is still lying around somewhere, or donate Victory (my Raspberry Pi 3/8G) to the mission. I had intended to load RHEL9, but Debian and I are still on friendly terms :).

Plus, Cream’s tenure as my previous file server included a history of the NUC being a pain in my ass!

The Insane Engineering of the Gameboy

Nice video giving an overview of the classic handheld’s architecture. The opening may be a little harsh, IMHO, but it’s not unwarranted. The way I look at it, the hardware is closer to what a microcomputer could have passed for just over a decade prior, and devices like the Apple II or TRS-80 were hardly portable, battery-friendly devices.

That’s a trend that I think largely continues with really portable devices. I remember looking at data about the first Raspberry Pi and deciding it would likely be on par with a ten-year-old PC, except closer in size to a credit card than a microwave oven, and pretty darn cheap. Likewise, while I find the Steam Deck’s graphics very unimpressive, I find it amazing that someone crammed an Xbox One level of horsepower into such a portable package.

It’s pretty darn cool how that sort of evolution plays out, even if my wristwatch literally has an order of magnitude more computing power than my first personal computer….