Various posts about the Google anti-trust ruckus and the question of whether the Big G is a monopoly occasionally provide nuggets of enjoyment. But I think my favorite is the quip that there's no price Microsoft could ever offer, because it's pretty much true.

Having lived through the era that was MSN Search in the 1990s, I have a mostly negative inclination towards Microsoft as a search provider, although on the occasions I've ended up using Bing (usually not willingly), I haven't had any particular complaints. Nor about the other major providers this side of the millennium.

But I think it's still a simple key fact that Google does dominate search. DuckDuckGo, for example, is rather successful. We even have several general search engines, and some (maybe even most) don't suck. But there's no chance in hell that most of them could compete with Google in monetary terms, such as paying to be made the default engine. Nor do I really anticipate Google being unseated by simple technological advancement or its own ineptitude, because unless Google Seriously Fucks Up(tm), it will likely remain dominant.

So yeah, there may be more than one search engine, but good luck competing with that. Ya know, it’s billion with a B, right?

People still love WordStar?

Encountering a nifty article on Robert Sawyer’s recent release of WordStar, I can’t help but think that by now, even the creators of WordStar would agree that it is abandonware.

Actually, just about any piece of software with its heritage in the CP/M era should probably be considered past its commercial viability. In the sense that if you're still making a living off software that is over 40 years old, it may be time to encourage your customers to upgrade 😂. I'd be surprised to find many people younger than myself who even know what the program is, never mind anyone learning it today, because it's been quite a freaking while since it was popular.

WordStar 7 at least is a version from a time when MS-DOS was mature, and it still predates most of what people younger than me identify as a computer. Looks like Mr. Sawyer went full-tilt boogie with trying to make it a full release; if anyone is fond of old software, it's probably worth a shot. As for myself, I'm more of a vi kind of guy once we start going down that hands-on-keyboard rabbit hole.

The Standard Intel Response

Reading a recent article on the 13th/14th Gen Debacle, I’m reminded of how problems with Intel typically roll:

  1. There will be a microcode fix if people won't shut up and enough of them complain.
  2. Haha, you think there’s really a fix for that!?
  3. Please buy the next chips!

My mind kind of flashes back some years to the errata documents for various SoCs that I was working with. Between the scary number of Won't Fix and Even We Don't Know What Will Happen items, and other run-away-screaming level worrisome things mentioned, it rather made me start to wonder: what does Intel ever actually resolve? Because quite frankly, my Latitude experienced issues similar to some of those errata items despite being 3 – 5 generations older than the SoCs that I was working with at the time.

Actually, that's the main reason Zeta was built on an AMD platform. Having been an Intel brat since Tandy made computers (🤣), I've tended to prefer Intel processors over the years. But having to work more closely with hardware for part of my career rather soured my relationship and goodwill towards Intel. Why do I say that? Well, Zeta's my first AMD machine in about 17 or 18 years….and that gives me the startling realization that it's been almost two decades since my darling Dixie, my first laptop.

Experiences with Rimuru's 10th generation processor and various motherboards further exacerbate the feeling that it will likely be my last conventional desktop PC, and that Intel Inside probably won't be a boon when building or shopping for its replacement someday.

Dusting off the old iBook G4, I can't help but wonder how the machine would behave with a solid state drive instead of its twenty year old 40 GB IDE drive. Mac OS X Tiger is surprisingly nimble except when it isn't, and most of those moments involve operations that hammer both the processor and the I/O. When I ran OpenBSD on it, the key limits were the lack of SMP and modern JavaScript engines having moved on without 32-bit PPC support.

The ol' PowerPC chip is about as impressive as a single core CPU can get, I suppose. But the hard drive is basically a potato. The trick, however, is that it would be a major pain in the ass to replace the drive, even if one of my microSD to EIDE bridges would fit without a hubbub.

Ahh dang it, why is temptation such a problem when it involves old computers? SMH!

Daylight – 60 fps e-paper?

Now that's kind of impressive. The main negatives of e-ink are that the refresh rate is generally trash and the resulting flashing during bigger refreshes can be quite jarring on the eyes. But e-ink works really damn well as long as you don't need things to move or change on screen; it's excellent for reading but terrible for scrolling.

The trade-off of a more LCD-like "it's off" when the power is off, versus how e-ink holds onto its image, isn't that bad; for such high refresh rates it would be more than worth it on a computer, though not so much for things like price tags on store shelves.

It's also kind of interesting what they might come up with by trying to reimagine human interaction with computers, though I'd expect that to be a lot less successful than the display technology, due to the crushing weight of established conventions. But just the same, I'm curious to see where they take it. Ironically, I'm reminded of the Think Different speech.

Here’s to the crazy ones. The misfits. The rebels. The troublemakers. The round pegs in the square holes. The ones who see things differently. They’re not fond of rules. And they have no respect for the status quo. You can quote them, disagree with them, glorify or vilify them. About the only thing you can’t do is ignore them. Because they change things. They push the human race forward. And while some may see them as the crazy ones, we see genius. Because the people who are crazy enough to think they can change the world, are the ones who do.

Steve Jobs

Federico Viticci's Not an iPad Pro Review is, I think, a scathingly well done piece.

Having used Android tablets as an all-the-things primary computer for the better part of a decade, I particularly found the problem of background processes to be a killer. It effectively relegated my iPad from a fully productive machine to a fully-everything-else machine that probably cost twice as much as the Galaxy Tab S-series it replaced. In fact, I virtually never reach for the Magic Keyboard and fire up an SSH client on my iPad. I'll break out a heavy ass laptop or walk across my home if I need that, because it's annoying as fuck when you switch apps and come back to find your session gone. By contrast, I readily used my Android tablets for SSH tasks, both for programming professionally (ssh -> build server) and for local things (ssh -> my servers).

I also rather like the notion of Desktop-class apps as a term. Apple's spiel about tablet-optimized apps vs Android was largely bullshit and handwaving back in the day, but I think Desktop-class captures the distinction well. My aging iPad Pro grants access to a few professional quality apps that are available on iPadOS, like Working Copy, Procreate, and LumaFusion, but relatively few Desktop-class applications.

By contrast, my Android tablets were often good enough for dock-to-a-monitor-and-have-at-it modularity, because guess what? Some crazy guy probably wrote a desktop-class application worth paying for, or no one actually gave a fuck. On iPadOS, on the other hand, the best thing I can generally say about most of the software is that app xyz is almost the same as xyz on Android. There are a few that I miss even when using Macs and PCs. And as Federico notes, there's basically a plethora of things that just can't exist on iPad because there's no support for building them.

Multi-tasking is more of a meh topic from my perspective, but I think his description of how it's evolved is spot on. Personally, I like the more full-screen, task centric nature of Android and iOS. I wrote about that plenty of times in the 2010s back when G+ was a thing, and in a few journal entries here. The whole floating window thing I find rather nice if you have a 20″ to 30″ monitor, but not so useful when you cut that screen in half, like on a tablet or a laptop. I appreciate the ability to split screen or slide over or float windows on my tablet, but not as much as I appreciated Android apps allowing me to do things like switch between a terminal session and an email without fucking up what I'm doing.

Stage Manager kind of squeezes it in for me. On Mac, I enjoy Stage Manager because it helps organize and group windows effectively for working on tasks. On iPad, I mostly view Stage Manager as a sucks-less way of switching between applications when multi-tasking, more than anything else. On the flip side, iPadOS did grow the ability to drive external monitors far better. But as for the actual multi-tasking experience, the most that I can say all these years later is that I no longer have to reboot my iPad constantly when using Slide Over, because I basically never use it on purpose :P.

Or should we say, I enjoyed the quality of iPadOS's launch version so much that if anyone ever bemoans the quality of my code, I'll just ask if they ever did much with the first version of iPadOS 😂.

A nice gander at the Apple Lisa

The video might be a tad boring by contemporary standards, unless, like me, you have an interest in such ancient technologies 😛. But I think it does make a nice demonstration of the system.

Since the guy is using actual hardware, it is also slow as crap by modern standards. Let's just say that the world has come a long way since a Moto 68k and a meg of RAM was plenty. But I think it was a fairly impressive and innovative system for its day.

I kind of like the Electronic Desktop metaphor more than the conventional Files and Applications approach that the typical Windows 9x PC settled into some decades later. I love the document centric rather than application centric view as a concept. It seems like it was a good attempt at creating an environment for office workers who weren't computer people. The ability to have files with the same name is odd, and interesting, if likely impractical for software developers. The natural saving and manipulation of content is nice.

In addition to the UI design, its relationship to the early Mac seems fairly apparent. In particular, one of the odd things that I encountered digging into 1990s PowerBooks and System 7 is how the classic Mac OS treats placing files on the desktop (basically a flag saying it's on the desktop) and how it handles floppy diskettes. Both are rather different from modern systems of any sort. It looks like a lot of the Lisa's concepts made their way into the original Macintosh and later system versions.

It's kind of a shame that the Lisa was insanely expensive and (IMHO) rather slow, at something like $10,000 for a basic system. While I'm not convinced that the original Mac could have been a good idea without at least a second floppy, its base price of $2,500 was at least less comical than the Lisa's. Or should we say, a 512K machine with way more storage would probably have been worth every penny and still way cheaper than the Lisa.

Random things

Powered on Stark to test a boot stick, and figured I'd let the system go update itself. Went downstairs to wash out my coffee cup, and coming back, the line of sight from down the hall to where I left it on my desk reminded me of one of the things I don't miss about the old Latitude: the screen!

Stark was from a transitional era. One in which more consumer oriented laptops began to adopt Intel's concept of an "Ultrabook" and more business oriented laptops refused to give up their ports until you pried 'em from their cold dead motherboards. But almost universally, they all agreed on having a shitty screen compared to basically everything else in computing at the time.

As such, while the laptop served me very well, it wasn't without compromises. The typical 1366×768 pixel screen was basically trash, but it did support external displays, and that's how I tended to use Stark. Onboard were a VGA port (ha!) and a (mini)HDMI port in a size nothing else really adapted to, but as it got older, docking stations able to drive a pair of DisplayPort/HDMI outputs were cheaper than having an adapter shipped off eBay, and the Intel chips back then maxed out at three display pipelines anyway. Ditto irksome things like having an eSATA port at the price of a SuperSpeed USB port, having to dedicate a USB port to a Bluetooth dongle, needing a fanny-pack type battery to get runtime that wasn't a joke, and weighing almost a kilogram more than I wanted to lug around every day.

But the machine also had its upsides. Like a TPM for encryption, a modular slot that could be fitted with an OEM optical drive or a replacement fitting for a second 2.5″ SATA, and a Core i5 that actually served well up until the rise of Electron applications like Teams and Slack. It also helped that I had enough Latitude D/E series compatible chargers around to never worry, except when working away from an outlet.

All in all, Stark has the unique position of being a computer that managed to not piss me off more often than not. That’s not something many computers can say. So, I think Stark was a successful machine, even if it’s going to stay retired, lol.

Network evolutions

Thus far, this is looking to be the evilicious plan:

  1. The ns1 VM on Zeta will be converted from a primary name server that is authoritative for home.arpa and forwards to external DNS, to being authoritative only for my LAN and providing nothing else (rough config sketch after this list).
  2. Cream will be unretired and converted into infrastructure.
    • DHCP services (v4/v6)
    • DNS services à la ns2: secondary for my LAN and forwarding to external DNS.
    • Add CNAME aliases for ns3 and dhcp because I’m silly that way.
  3. Reconfigure Eero to use my Raspberry Pi Zero ns2 and Cream as its name servers.
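
To make the intent a little more concrete, here's a rough sketch of what items 1 and 2 would look like, assuming BIND-style configuration purely for illustration; the daemon choice, addresses, and file paths below are placeholders I made up, not what's actually deployed:

    // ns1 on Zeta -- authoritative for the LAN zone only, nothing else
    options {
        recursion no;                       // no forwarding or recursion for clients anymore
    };
    zone "home.arpa" {
        type primary;
        file "/etc/bind/db.home.arpa";
        allow-transfer { 192.0.2.0/24; };   // placeholder LAN prefix so ns2/Cream can pull the zone
    };

    // Cream (ns3) -- secondary for the LAN zone, forwards everything else outward
    options {
        recursion yes;
        forwarders { 9.9.9.9; 1.1.1.1; };   // whichever external resolvers end up chosen
    };
    zone "home.arpa" {
        type secondary;
        primaries { 192.0.2.10; };          // placeholder address for ns1
        file "/var/cache/bind/db.home.arpa";
    };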

The reasoning for this is that while Zeta could provide DHCP services easily, and the VM running name services has been effective, Zeta is not a machine that I want as a single point of failure; as a single point of truth, though, it's convenient.

The notion here being: if Zeta sees downtime, then as long as the outage is shorter than the time ns2/ns3 take to expire the local zone information, the impact across my household is just the inconvenience of fixing a computer. That is to say, either the failure is transient enough not to cripple things, or epic enough that I convert ns2 into the authoritative server using the already-in-place, ready-to-swap-into-master-mode setup that I have. And simply put, Cream wouldn't be intended to be fucked with or rebooted, and would in turn have a similar failover of moving to Eero's DHCPv4 and re-enabling forwarding on ns1.
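
For reference, that "shorter than the time ns2/ns3 take to expire the zone" window is governed by the zone's SOA timers. Something like the following (values invented for illustration, not copied from my actual zone) is the knob that decides how long the secondaries keep answering while ns1 is down:

    $TTL 3600
    ; illustrative SOA for home.arpa -- the expire field bounds how long
    ; ns2/ns3 may keep serving their copy of the zone after losing ns1
    home.arpa.  IN  SOA  ns1.home.arpa. hostmaster.home.arpa. (
            2024010101 ; serial
            3600       ; refresh: secondaries check ns1 hourly
            900        ; retry: every 15 minutes if that check fails
            1209600    ; expire: keep serving for up to 2 weeks without ns1
            300 )      ; negative-caching TTL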

Except for two pains in my ass:

  1. Cream’s CMOS battery has died in storage since its retirement.
  2. Cream is refusing to boot the install media.

Actually, I should probably just see if Magic (my old Raspberry Pi 2) is still lying around somewhere, or donate Victory (my Raspberry Pi 3/8G) to the mission. I had intended to load RHEL 9, but Debian and I are still on friendly terms :).

Plus, Cream's tenure as my previous file server included a history of the NUC being a pain in my ass!

The Insane Engineering of the Gameboy

Nice video giving an overview of the classic handheld's architecture. The opening may be a little bit harsh IMHO, but it's also not unwarranted. At least the way that I look at it, the hardware is closer to what could have passed for a microcomputer just over a decade prior, and devices like the Apple II or TRS-80 were hardly portable, battery friendly devices.

That's a trend that I think largely continues with really portable devices. I remember looking at data about the first Raspberry Pi and deciding it would likely be on par with a ten year old PC, except closer in size to a credit card than a microwave oven, and pretty darn cheap. Likewise, while I find the Steam Deck's graphics very unimpressive, I find it amazing that someone crammed an Xbox One level of horsepower into such a portable package.

It’s pretty darn cool how that sort of evolution plays out, even if my wrist watch literally has an order of magnitude more computing power than my first Personal Computer….