From Rimuru to Ranga

Increasingly, I've been turning my mind to what will come after Rimuru: a machine originally built in 2021 using the COVID-19 stimulus as its foundation and the same general design as its predecessor, Centauri. Since then, it has undergone six refits, between Rimuru suffering a motherboard failure and the ordinary run of tech updates.

Simply put, the status quo for the last few years has been that only one memory slot on the board is still functional, and the intention was that there would be no third motherboard if it fails. Combined with what is now a 5-year-old Core i7, the single slot of RAM has proven to be the key bottleneck. Ironically, getting Oblivion: Remastered to run well was more an exercise in pushing the GPU load to a point where the CPU isn't pegging out.

It's also been a downside that the old CPU under heavy load and the Big-Assed-GPU cranked up together practically turn the machine into a space heater. I designed the machine to handle sustained load while keeping its own thermals under control. The catch-22, of course, is that I can easily find myself sitting in a room that climbs towards +10 degrees after a long spell of gaming, like playing Silent Hill f over the weekend.

Following Maleficent, I considered swapping the GPU and NVMe drive over to Zeta, and converting it from a file and virtual machine server into Rimuru's successor. That's actually how Centauri had become my previous desktop. Of course, breaking down and cracking the case revealed roughly what I expected with that plan: I could fit the PSU and the cooling system, or I could fit the GPU. Zeta's PSU would be able to handle 'technically' fitting and powering Rimuru's RTX 4070 Ti, but doing so would require removing the liquid cooling system to accommodate the PSU. So, that plan failed.

One of my long-term plans over the past lustrum or so has been that Rimuru would likely be my last conventional "Desktop PC." I've never really been a believer in gaming laptops, but here we are.

Christened Ranga, since its job is to blow Rimuru away. Amusingly, using Oblivion: Remastered as a point of reference, it delivers similar performance but with the opposite bottleneck. Rather than being CPU bound, Ranga is GPU bound, but it still lands firmly in the realm of pick-your-frame-rate: closer to 30 at Ultra/4K, closer to 60 at Medium/4K, and a pretty slick 40s-50s at High/4K.

A bit of rewiring all the things, and my dock is now situated underneath the monitor rather than within a passive Thunderbolt 3 cable length of the desktop. Somehow, the part that bothers me about this arrangement is that a 2 meter long active Thunderbolt 5 cable cost about the same as my shorter TB3/TB4 cables did, while being rated for 80 Gbps/240W, far higher than my dock can handle. On the flip side, for cooling purposes a small stand was necessary to ensure proper ventilation.

In tests so far, I'm finding that the Zephyrus G14 is a sufficient match. Its RTX 5070 Ti mobile just can't match the horsepower of the RTX 4070 Ti desktop, but it comes close enough that no longer being bottlenecked on the Core i7-10700K and its single memory slot resolves that pickle. Its Ryzen AI 9 HX 370 represents a major generational leap in performance, and while the amount of RAM remains comparable, it isn't so limited: so yay for being back to dual-channel memory!

As an added benefit, when putting Shion in place as my primary computer, I no longer have the problem of not being able to see where the fuck the port is, since it's no longer facing the wall. I kind of liked having my laptop off to the side as before, but the occasions where I actually use my laptop as a notebook PC made it a bit of a grumble to reconnect; more so than swapping between TB cables at the dock. Now? It's simply swap laptops in the stand, with a single cable running to the dock.

Another benefit is proving to be the heat. The Zephyrus G14 is very quick to crank its fans into high gear when gaming, to the point that one might want noise-canceling headphones rather than speakers for some content. But it doesn't raise the room's ambient temperature as drastically as my desktop did, and frankly, the late-generation MacBook Pro 16s had louder fans :-P.

One of those random backlogged things to write my thoughts about

Bumping into "Apple found clever iPhone Air innovation for a thinner USB-C port" a few weeks ago made me do a bit of a double take.

It also made me try to imagine what the engineers who worked on the F-14 Tomcat must have suffered. Electron beam welding a wing box from titanium, along with the more general "How the hell do we even build that?" problem, were among the challenges back in the '60s: a time frame where these solutions were more revolutionary than antiquated. We mostly remember those planes for the swept wings and cool movies, but I bet the engineers who worked on that wing box remembered it as the challenge of a lifetime 😅.

And then fast forward about sixty years, and we have people talking about 3D printing titanium.

When you name a server Maleficent

Recently, I've been grumbling more than usual about Zeta's bridging of VMs into the local network segment getting borked by package updates; enough so that pulling the trigger on my migrate-to-AlmaLinux-10 plan got accelerated. Rather than waiting for ELevate to consider this upgrade 'not beta,' I went with the reinstall process.

In debating whether I wanted to go ahead and set up the libvirt environment again and keep grumbling, or perhaps just go with my original plan of using Docker, I opted to take a different tack. The master nameserver being a VM was mostly because hosting virtual machines was added to the expectations list when Cream was replaced; and some readers might recall that the ol' NUC7 got unretired into becoming nameserver 3 as part of the Asus->Eero transition.

So, I decided on Plan B: bare metal. A MINISFORUM UN100L and a drive to MicroCenter later, and I had decided on two things. One is that $180 on sale would be damn well worth not having to screw with the virtual network bridge again, and two, that I would name it Maleficent because I was pissed off at solving these problems.

The real question is stability. It's been quite a while since I last edited the zone files (December), and there have been more than a few incidents of "Why the hell is ns1 not reachable again!?" since Zeta's inception. If Maleficent serves as the new name server 1 until Christmas without any fuckery, I will call that a solid win.

In unboxing the new hardware, I also considered a third alternative that might suit a longer-reaching plan. The lack of Thunderbolt aside, Zeta's hardware would actually make a great replacement for Rimuru; the catch is whether Rimuru's graphics card and the machine's power supply can both fit in that tiny-ass case.

With Cream and my spare Raspberry Pi Zero W functioning as name servers 3 and 2, it would actually be simple enough to convert Maleficent into the central server. The bind instance functioning as the master / name server 1 for my internal domain is locked down; other than zone transfers, all the traffic actually goes through Cream and the Pi Zero. Its existence as a separate entity is largely administrative, and in fact, the two name servers serving my home network are running a configuration designed so that either of them can be swapped over into becoming the SOA for the local domain. So, I wouldn't feel too bad if bind and samba lived on the same machine. In fact, it would be quite effective, since Zeta's storage array is connected to a 5 Gbps USB-A host port and Maleficent's N100 is far faster than my old laptop's aging Core i5.

That, however, is a tale for another time. For now, all hail maleficent.home.arpa!

ARM ftw

Away from its charger for 4 days of light to medium usage, Shion is only down to 45% charge; I think it's fairly safe to say the M2 has good battery life.

Makes me recall my first laptop, whose Sempron would generally reach 2 hours and 30 to 50 minutes if one was lucky. At the time, that actually wasn’t bad for an x86 laptop, never mind the third cheapest at Best Buy. It was a machine best used with a charger except for short spurts of being on battery, regardless of system load.

For the most part, I pretty much forget that my MacBook Air even has a battery.

One of the side effects of the RAID-mode oops incident has been having to re-rip and encode my Blu-rays and DVDs. At this point, most of the anime collection is basically done, but movies are in the "as needed" category because of the time and effort involved.

Recently, I was in the mood both for watching Pacific Rim and for taking a look at one of my original reference videos from back when I set up my previous AVC/x264 presets in HandBrake; i.e., Prometheus. In the years since then, I shifted over to an HEVC/x265 preset and slowly started to adopt it. Most discs since then have been anime or few and far between, so not as large a sample set.

So, naturally, this was the preset I chose when ripping Pacific Rim. However, I found myself disappointed in the video quality. Fortunately, I still enjoyed it greatly, as one of my favorite films and one that I haven't seriously watched in a few years.

In particular, the opening sequence and numerous darker scenes exhibited artifacts. Now, my original AVC preset wasn't perfect, but it wasn't that bad either. Taking the first chapter, I decided to do a bunch of experiments focused on the parts most prone to artifacts: the logo's background fire effect, the star field, and the breach, followed by the more general video quality of the next 5~6 minutes of the opening.

| Encoder | Quality | Size (MB) | Bitrate (Mbit/s) | Time | Comments |
| --- | --- | --- | --- | --- | --- |
| Blu-ray | N/A | 38,320 | 22.8 | N/A | Reference Blu-ray ripped with MKV. |
| x264 | RF 20 | 1,010 | 14.8 | 06:05 | Reference AVC. Limited artifacts. |
| x265 | RF 18 | 949.1 | 13.8 | 13:11 | Like reference AVC. |
| x265 | RF 20 | 794 | 11.5 | 11:39 | Close to AVC reference. Not as good as RF 18. |
| x265 | RF 22 | 688.1 | 9.8 | 06:56 | Reference HEVC. Too many artifacts. |
| Video Toolbox | CQ 80 | 1,780 | 25.7 | 01:08 | Close to AVC reference (not as good). |
| Video Toolbox | CQ 70 | 1,080 | 15.7 | 01:08 | Close to AVC reference (not as good). |
| Video Toolbox | CQ 22 | 448.9 | 6.5 | 01:07 | Like a mid 2000s video game (only better). |
| Video Toolbox | CQ 18 | 453.2 | 6.3 | 01:07 | Like a mid 2000s video game (really). |
| Video Toolbox | CQ 10 | 424 | 6.1 | 01:07 | Like a mid 2000s video game (too close for comfort). |

The AVC and HEVC references referred to above are my presets. For x264, the high profile at level 4.1 was used with the "medium" preset. For x265, auto was used for both profile and level, with the "fast" preset. The only adjustment for the experiments was the Constant Quality, which for those encoders is a logarithmic scale where higher numbers mean lower quality.

For Video Toolbox, I couldn’t find any documentation about the scale but the tests obviously show higher numbers are higher quality. In each case, the “medium” preset was used.

Based on what I found, I'm kind of disappointed with the x265 cases. Perhaps it's time to experiment with kicking it up to the medium preset, or enabling a deblocking filter to compensate. For the most part though, the quality is sufficiently there if comparable bitrates are thrown at it. The downside, of course, is that doing so basically doubles the encoding time compared to x264.
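For the curious, here's a minimal sketch of what that experiment might look like: shelling out to HandBrakeCLI from Python with roughly the x265 settings described above, kicked up to the medium preset. The file names are hypothetical placeholders, and this is just a sketch of the idea, not my actual workflow.

    import subprocess

    # Hypothetical file names; swap in the actual MakeMKV rip and output path.
    source = "pacific_rim_t00.mkv"
    output = "pacific_rim_x265_rf20.mkv"

    subprocess.run(
        [
            "HandBrakeCLI",
            "-i", source,
            "-o", output,
            "--encoder", "x265",
            "--encoder-preset", "medium",  # bumped up from my usual "fast"
            "--quality", "20",             # RF / constant quality; lower numbers mean higher quality
        ],
        check=True,  # blow up if HandBrakeCLI exits non-zero
    )

A deblocking tweak would go in via --encopts, but I'd want to double-check the x265 documentation for the exact parameter syntax before trusting it.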

The Video Toolbox case is more impressive, but also not so useful. I believe the M2's encoder is a lot better than the 'Bridge and early 'Lake era Intel encoders, but in much the same way, they just don't serve my purposes. To make my M2 achieve good enough quality for streaming, the file sizes balloon to near the original Blu-ray's, so I may as well not bother transcoding in that case. But still, we're talking about a speed of 190~200 fps encoding versus about 30-40 fps. I think it's better suited for video editing than for streaming video from my server to my TV.

The quality difference, though, is considerable. Even at the uber quality levels, it's still subpar versus a Blu-ray reference; closer to Netflix/YouTube quality at this point.

Partly, I'm tempted to revert to using x264, and partly tempted to just leave it at Blu-ray quality. I didn't really change from AVC to HEVC to save on disk space, so much as because the more modern codec was now widely available on most of my hardware. The kind of perspective that AVC is still fine, but I assume devices will hold onto HEVC support longer once AVC becomes the new MPEG-2 :D.

There's also the option to just stick with MakeMKV's output. My entire Blu-ray collection probably represents about 4 TB to 5 TB of data at this point, and ahem, it's an 8 TB storage array with 6 TB free. My storage concerns were pretty much solved two sets of hard drives ago, back when my server's storage was made up of 3 TB drives rather than 8s. As for the playback concerns, well, much like HEVC-capable devices becoming the norm, most of my devices have less trouble with Blu-ray quality bitrates at this point.

The memory consumption of the modern web is amusing. Going by Activity Monitor's memory page as a measure, Wikipedia is pretty darn memory efficient at a couple hundred megs; for a website that's about 95% content aside from the CMS, I'd call that pretty good. For the sake of a value, the random pages I loaded were around 180-200 MB.

By contrast, my own website (also mostly content aside from the CMS) and Google's home page burn around 400-500 MB. Which, to me, feels excessive; but to be fair, part of why I started to believe in whole-lotta-memory designs was in the 2010s, when I realized 2 GB was not enough to surf the web anymore unless you loved thrashing hard drives. Most content-heavy rather than ad-heavy websites fit that description as well.

Now for comparison? A few common news sites that you'll see just about any computer nerd visit: around 1.5-2 GB for basically any article, easily reaching 3-4 GB in Activity Monitor, largely depending on the amount of ads and whether or not it's a website that endeavors to load the content first. So basically, modern websites consume so much RAM that we should be charging websites for the resources their advertising consumes, rather than supporting them by not running ad blockers 😜

Ya know, if you just throw enough videos at it, modern web advertising may be to computer memory usage what the pop-up ad was to the 1990s web browsing experience. Seriously, web ads were Really A Lot Worse Back Then™, when actual pop-up windows were allowed.

At this point, I think the only thing that consumes more RAM than the modern web is working with high resolution graphics and artificial intelligence with loads of parameters, lol.

Reading "BOFH: The USB stick always comes back – until it doesn't," I'm quite sure I've known a few sticks that meet this description.

I still remember the pain and suffering that was the performance of USB sticks, back when I had to do some installation framework goodness for an embedded system. The whole live bootable stick was taking about 2 1/2 hours to load per test, so eventually I reached for the pair of USB sticks I was carrying at the time. One of those $10 sticks reduced the I/O wait time down to about 35 minutes, making for a much faster development cycle; so much so that production bought a bundle of those for use in the lab.

And then there was that other $10 stick, which I never realized was so drastically slower than the first. That sucker was so damn slow that it was “Free to good home USB stick, just don’t bring it back!” and of course, I met that stick more than a few times over the years.

Ya know, I should've just used the soldering iron 🔥

Reading “I was skeptical about Snapdragon laptops. A work trip made me a believer” from my news feeds, I couldn’t help but snicker at the actual story content.

See, the first time I had the joys of three planes to reach a destination, I was rocking an Asus EeePC 1015-series that I principally kept in text mode and low-key to be able to write on. It didn't even last the first flight, never mind the entire 12-to-14-hour trip. Literally, all it had to do to be sufficient for my purposes was run vi.

The next time I ended up on such a trip, I was rocking an Eee Pad Transformer TF101 and found it to be a world of difference. Instead of a dead battery on the first flight, I literally had enough power left to fall asleep to Netflix in my hotel room before I even reached for my charger.

In the years since, Intel has managed to “Not suck” on power efficiency but only in relative terms in my experience. It’s not crazy to get a decent work day out of an Intel powered laptop, provided you are not pushing it too hard. But once you ramp up the workload the battery life tanks accordingly as measured in hours away from an outlet. By contrast, my first ARM powered tablet never blinked unless I was compiling code in a chroot for 8 hours straight, on a system that wasn’t meant to work that hard in the first place.

I kinda look at the notion of Windows on ARM with hopefulness, because frankly I think it's about damn time. What helped PCs take over the computer world is the insane compatibility that IBM and Microsoft compatible machines afforded. Almost all of the Windows 95/XP targeting games in /dev/closet just work on modern machines, despite Intel PCs having changed drastically in every sense. Hell, I still have machines that will boot MS-DOS digital eons after both it and the floppy diskette ceased to be relevant.

Compatibility is the best reason to avoid different CPU architectures, especially for ‘brainless’ perspectives. But we passed the point where native code was the only option: Apple’s Motorola to PowerPC migration and the original Java virtual machine proved that well enough in the wild. That was several decades ago, and performance has improved all over. Today, my M2 MacBook Air can literally run video games in Rosetta 2 well enough that I wouldn’t know they weren’t native \o/.

One of the things people often forget though, is that Microsoft originally did something really clever back in the 1990s. Windows NT was designed with multiple architectures in mind. In practice, the MIPS and DEC Alpha support didn’t mean much since almost everyone would be using a 386 with a load of memory, or soon would be dominated by x86 one way or another. But NT hedged many bets.

Personally, I have no real love left for Intel despite being an Intel brat most of my PC life. ARM in my experience, better delivers what I tend to want in a battery-powered machine but that tends to be different than what I want in a desktop PC. Likewise, having dealt with more than a bit of shouting, “Damn it, Intel!” back when I was an embedded monkey, I have even less love professionally than personally. Even more so the further you go from purely CPU and into their other pieces of the puzzle.

They had a good run, but I say hasta la vista, Intel.

WiBArM

There was a game that I played as a child on our old Tandy 1000 that I've wanted to look up for about the last 20 years or so. A relative had sent us a copy, but no one could really read it since the instructions were in Japanese. The 5 1/4″ diskette, however, worked just fine.

One of those problems with being a kid, even if I can remember things from way back in more detail than I probably should, is that memory is more visual than auditory for me. My best recollection of the name was something like "Wib barn," but hey, I was like 5 when I played the game! It's not like I could spell yet! Literally, I was using MS-DOS before I could read... lol.

Needless to say, trying to find the game on this side of the floppy diskette era has been largely fruitless whenever I’ve tried.

Well, last night I was watching a video on 80s game development, and noticed that the clip of Thexder was Really, Really similar to what I remember but definitely not the same game. Sadly, doing some research into the game also showed that its sequel wasn’t it. Also, I’m pretty sure that I never encountered the Firehawk games personally.

Attempting to find similar games led me to the similar games tab on Giant Bomb, which began as mostly another exercise in futility. I've tried to look up the game in the past, always without success. Like seriously, how many late-80s side scrollers were there where you can transform between a robot, a jet, and a car while exploring pseudo-3D dungeons reminiscent of the Windows 95 maze screensaver? Yeah, you'd think that would be easy, but a lot of stuff about old video games never made a big impact on the Internet. There were more than a few such 'robot' games, but I don't think any that combined all three modes.

Then, in scrolling through the list, I come across one word and it's like "Holy shit, I remembered the name right," and lo and behold: their page on Wibarm even matches my childhood memory to a tee in the screenshots. It's without a doubt the same game, and it checks all the boxes: the three modes, the side-scrolling and 3D-like dungeons, and the almost RPG-like battles when you encounter the mobs.

So now, after many years, I finally know what that game was, and that my childhood memories are even more accurate than I expected.

Thoughts on Home Audio

In general, I haven't cared too much about whole-home audio, but I've had it available long enough to take it for granted.

The first setup I had for that was based around Chromecast, almost a decade ago. The 5.1 Vizio surround sound system I had in my apartment had Chromecast support, and the Chromecast Audio was very cheap and very effective when paired with the AUX/Line-In on the analog Logitech 2.1 that was hooked up to my desktop. No real complaints about the multi-room audio support, although the sound system alone was enough to fill my apartment.

Sadly, both of those audio devices went the way of the trash heap over the years, and I haven't really used the Chromecast Audio since the move to Desktop=Games, Laptop=Desktop. The move from an Android tablet to an iPad made that less of a concern, since the iPad Pro was second only to the surround sound system in terms of speaker quality at home, and could fill my entire apartment with music almost as easily.

The second setup, which is still in use, is an Alexa-based one. Over a lustrum, things expanded from a simple puck-shaped Dot to an orb with a clock; enough to easily have audio control over my shit in the bedroom and the kitchen/living room space of my apartment. These days, one of those is still on my headboard and the other is in my kitchen/dining area; and I bought a Pop to gain Alexa control in the study.

Because of how good Nerine's speakers were, I never really cared too much about whole-home audio, even after losing the surround-sound setup. Doing multi-room audio with Alexa worked well enough in my experience. But in practice, I only tend to need current-room audio and, at most, briefly next-room audio. The kicker, however, is that which audio device I'm playing from can vary. It's usually going to be my tablet, but it might be my laptop, or my phone, or one of the experiments from earlier this year, etc., and sometimes my devices change.

That's kind of what led to the Roam 2 solution. I was never really impressed with the Echo Dots for music playback, but they were good enough for anything that's not an iPad Pro. Since Nerine's retirement, that's now basically the case for everything, because the Mini can't beat the Pro on listening to music, lol.

For me, the Dots have always been more about whole-home Alexa control than whole-home audio. But really, both have been a pain in the ass in recent years. Generally, I liked Alexa control. As a voice assistant, it worked better than Siri, which has always been rather meh for me, and unlike Google's, it doesn't tend to make me a rage-monkey. Let's just say, Google's voice assistant wasn't a concern when I left the Android ecosystem.

Since Amazon’s cutting up of their Alexa division, I’ve generally found myself going more “Why do I even bother” at how well my Alexa control works, both in terms of voice recognition and third party things. Enough so, that I mostly consider its days numbered at this point. Since discovering that my watch can handle “Hey Siri, turn on the book lights,” I’m even more considering the end of Alexa control. As meh as my relationship with Siri is, when it works, it does actually work.

E.g., if my typical use case is something like, "Alexa, turn on the book lights," as I'm changing my clothes after work, there are two ways this exchange can end: either the lights turn on by the time my belt's off, or I may as well go out of my way to do it by hand. Let's just say that I've become grateful over the last couple of years that the controllers for my Nanoleaf lights are easily accessible, and that they use a capacitive button that doesn't make me worry about straining the adhesives.

That’s how much my relationship with Alexa control has soured in the last two years compared to how well it worked (let’s Alexa all the rooms) ~five years ago.

In terms of a Bluetooth speaker for the current room though, I'm basically calling the Alexa setup a dead stick. Anything much more than saying "Pair phone" and hoping the current Dot connects to my tablet is more bother than it's worth. Pairing new devices typically ended with grumbles like the Dot connecting to my actual phone rather than the device I'm using, issues getting it to connect to the device I've named, and the joyous fun that is connecting a new device, even when using the Alexa app.

So, a portable speaker is looking to be a good plan. Off sale, the Roam 2 costs a bit less than my surround sound system did, and on sale, closer to what a replacement for my JBLs was looking like on a purely Bluetooth front.

I think Sonos is too damn expensive for building my next surround sound system, and may be too expensive for my taste in terms of equipping a speaker per major room, even if I exclude the smart home control as a factor. Really, for downstairs it would make more sense to just migrate from Fire TV to Apple TV to have an AirPlay target, when I eventually go surround sound again. Since the Vizio's demise, I've just made do with the TV's integrated speakers and been glad that they don't sound like ass.

For the short-to-medium term, both audio and smart home control are on the agenda. For right now though, AirPlay -> Roam 2 -> take it with me is looking to be a good plan. Plus, in the study, my laptop is usually docked with the Pebbles on, making it a dandy AirPlay target.