I don’t really remember the 2010 Census, but I do find it a touch amusing that in 2020 we basically get a web link and a UUID.

Somehow that seems like great progress, but it still amounts to several sheets of paper in the mail just to serve notification.

Over the course of my life, I’ve mostly determined that a few things are relatively true about e-mail:

  1. Email is either the best or worst invention, probably both if you grew up with paper.
  2. All mail user agents pretty much suck.
  3. All standard protocols for dealing with mail are ancient.
Point one is something that I’ve concluded since the ‘90s. Point two is mostly internal bias. Point three should probably be considered fact at this point.

An experiment in laz^H^H^Hstubbornness with clocks

Sometime after getting a smartphone in 2010, I eventually changed from having my 90-decibel alarm clock to having one of my always-on devices handle the job. Today that equates to my tablet and a triple alarm system: one to make sure the other two wake me up, one to wake me up, and one to make sure I don’t go back to sleep. As a consequence, for a very long time now I’ve been down to changing just my stove-top clock and the clock on my car’s head unit.

At the last time change I decided to skip updating the clock on top of the stove, and let it ride until today’s time change. Because I’m frakkin’ tired of setting clocks twice a year, even if I’m down to so few clocks.

The real question is whether or not my brain will quickly adjust to reading stove time as actual observed time, or continue to automatically subtract an hour. Yeah, fun.

My Decade with the iPad: Upping the Ante
https://flip.it/-_vnWx

For me it was the Asus Eee Pad Transformer, the original TF101 model. My Linux-powered netbook had fairly limited battery life compared to the bottomless battery of a docked TF101, and its desktop struggled under loads that Android breezed through on even less powerful hardware. On the flip side, even the lowly netbook could compile code far faster, but it couldn’t handle the rising UI load of modern web pages and desktop applications.

Or as I like to remember those days: if all I did was type notes into a vtty, my netbook would often be dead during one flight, and was mostly dead weight on longer trips. That experience traveling led me to consider a rooted Android tablet just for the battery life. The TF101 was kind of special in that it had good battery life on its own, and a slightly smaller second battery in its clamshell keyboard.

The tablet with the keyboard dock had enough juice to take three planes and fall asleep watching Netflix before needing a charge. After that travel experience, I went on to use Android pretty extensively as a laptop and desktop replacement until last year.

Beefy endurance compared to Intel brought me into the platform for getting stuff done. Having an excellent lean back on the couch experience kept me using it.

Having finished The Outer Worlds, I’m reminded of the last time I enjoyed an RPG that much. It was probably Dragon Age: Origins. They’re very different genres: The Outer Worlds is a science fiction shooter set in a caricature of a world; Origins was a sword-and-spell tactical game set in a fantasy world.

There are two really specific ways the games connect in my mind, aside from the level of fun.

One of the things I rather enjoyed is the open-ended way of conversing. In both OW and DA, you can pretty much respond to a given situation however you want. Will your interactions be kind-hearted, greedy, or antagonistic? It’s up to you. While some games insert hilarious options, The Outer Worlds, like Dragon Age: Origins, is very consistent in this execution of choice. Being able to introduce yourself as the former captain of The Unreliable instead of by name may as well be a running gag. Plus there’s the matter of choices that actually make a difference, and party interaction.

Another is the Not Another Sandbox Thing. I really enjoyed the Elder Scrolls games, for example. But the 2000s will probably be best remembered as the era of sandbox games, and of shooters trading the dozen guns in your back pocket for MMO-like skill attributes. I don’t really like “Open world” sandboxes as a game design. I find they often cause a loss of focus, and in many games not made by Really Big Makers Of Games, the sandbox often feels more like a copout than a benefit. In fact, even when it’s made by the big fish it still feels that way quite often. By contrast, The Outer Worlds and Dragon Age: Origins are more like a series of small, contained environments. You get the open-world aspects of being able to choose where you go and how you go questing, but you’re not dumped in a sandbox and left to wander around. I find this lends a greater focus to problem solving and questing.

Pretty tersely: The Outer Worlds is probably the best modern RPG that I’ve played in quite a while.

Moronic: thinking about snacks, and forgetting that I bought a box of Pocky on sale.

Scathingly brilliant: remembering I also have a thing of salted edamame in the freezer…

This is my version of a large breakfast, given that I rarely eat much for breakfast.

Of course I get plenty of stares from the peanut gallery.

Actually, 85% of the reason I buy the sausage is to share it with the minions.

An update to my previous post, but dropping priority to “Normal” mode.

General system responsiveness is normal. Handbrake’s average frames-per-second counter thingy now goes to ~8.1 on average, with extremes being more like 7.7~9.3, but it pretty well hovers at 8.1.

By contrast, applications like the browser I’m writing this post in are now at normal performance. It’s even possible to watch a previous encode without glitches and artifacts in the decode. But things like writing this post do cause more dips in Handbrake’s frame rate.

Overall, it seems fair to say “High” -> “Above Normal” priority loses about 5% if you want a nice round number instead of a range for the average, and “High” -> “Normal” loses about 15% by the same method.

Where “High” = cripples the desktop session, “Above Normal” = makes the desktop session feel a few hardware generations behind, and “Normal” = makes it just another process.
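Those round numbers can be sanity-checked with quick midpoint arithmetic. A small sketch (the 9.2 and 8.6 fps figures are just midpoints of the observed ranges, not new measurements):

```python
# Midpoints of the frame-rate ranges observed at each priority class.
high = (8.9 + 9.5) / 2          # "High":         ~9.2 fps
above_normal = (8.3 + 8.9) / 2  # "Above Normal": ~8.6 fps
normal = 8.1                    # "Normal": hovers right at 8.1 fps

def pct_loss(baseline, slower):
    """Throughput lost relative to the baseline, as a percentage."""
    return (baseline - slower) / baseline * 100

print(f"High -> Above Normal: {pct_loss(high, above_normal):.1f}% slower")
print(f"High -> Normal:       {pct_loss(high, normal):.1f}% slower")
```

The midpoints land around 6.5% and 12%, so the 5%/15% round numbers are ballpark figures that shift depending on where in each range you sample.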

I guess that solves that, lol.

A while back, when I made the leap over to HEVC, I remember tuning the default priority in HandBrake’s preferences. Mainly because the encode times are so long on my old Core i5-3570K, and because I tend to leave such jobs running overnight.

Well, for curiosity’s sake I’ve decided to see how monkeying with this setting changes things.

In my experience, nicing processes on my unix systems is rarely worth the bother. That is to say, a system like Linux tends to remain pretty responsive this side of multi-core processors, and most people I encounter screwing with niceness tend to be aiming at the wrong problem to start with. But NT is not unix, nor Linux, so who knows.
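For what it’s worth, here’s a minimal sketch of what nicing a child process looks like from Python. The function name and the niceness value are my own illustration; on NT the closest analogue is a priority-class creation flag, which is roughly what HandBrake’s priority preference toggles:

```python
import os
import subprocess


def spawn_niced(cmd, niceness=10, **popen_kwargs):
    """Launch cmd at a lower scheduling priority than the parent."""
    if os.name == "posix":
        # preexec_fn runs in the child between fork and exec, so the
        # raised niceness only affects the child process.
        return subprocess.Popen(
            cmd, preexec_fn=lambda: os.nice(niceness), **popen_kwargs)
    # NT has discrete priority classes instead of a nice value.
    return subprocess.Popen(
        cmd, creationflags=subprocess.BELOW_NORMAL_PRIORITY_CLASS,
        **popen_kwargs)
```

Something like `spawn_niced(["HandBrakeCLI", "-i", "in.mkv", "-o", "out.mkv"])` would then run the encode a notch below everything interactive.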

Looks like the default priority I had configured was “High”. Task Manager also shows a “Realtime” priority, but that one isn’t offered in HandBrake’s GUI.

Net result: Handbrake proceeds at a pace of roughly 8.9~9.5 frames per second, which works out to being closer to 9 than 10. High priority also means that while the encode is running my desktop session is virtually useless, as things like updating UI state take a back seat to next Tuesday.

After the first item in my queue finished and the next began, I lowered the priority to “Above Normal”. Impact on the encode is roughly 8.3~8.9 frames per second on the average counter thingy. So that’s a performance loss of about 3~6 percent, but my desktop session is actually usable. UI updates like the process table in Task Manager now refresh in something closer to real time than melting a Titanic-sized iceberg with a Zippo lighter, and general productivity is passable. Typing this post is about on par with running a web browser on an older computer. Small price to pay.

That roughly equates to the encode finishing less than five minutes sooner if I render my desktop useless until the task completes, provided the current pace keeps up. When the entire encode takes a bit more than an hour per episode, that’s kind of a meh trade. It’s also a little nicer seeing my processor bounce between ~70% and 99% in Task Manager instead of 97~99%, lol.

I’m kind of reminded of XP, and my opinion that it could be very stable if you didn’t do nasty things to it (or need shit drivers), but user-responsive under heavy loads it was not. As beautifully as the NT desktop has evolved since then, I still don’t think melting a Windows box is as smart an idea as melting a Linux box, if you’ve really got to melt your system under load.

When the third file starts encoding, I’ll probably try dropping the priority to normal and see what happens.