25 years and moving on

About twenty-five years ago, I wondered how well Internet Connection Sharing might work. Well, I did see it work for about 5 minutes, until I hit the power cycle test. Then I remembered it’s been that many years without any experience suggesting that Windows should be used as a piece of routing infrastructure.

The first experiment resulted in losing Remote Desktop to Cream and having to fetch an HDMI cable and juggle my keyboard and mouse over. Okay, that’s fair enough; it was a 50/50 shot whether I was clicking share on the right interface.

The second experiment actually worked great. Sharing the wireless interface got Rimuru an acceptable DHCP response: it was placed on a matching subnet and routed traffic through Cream at 192.168.137.1, complete with access to the one true gateway. Cream’s fan went into hyperdrive, but otherwise it was effective.

The third experiment was a fireball. I decided to reboot Cream and verify that it came back up; it is my file server, after all. At which point everything ceased working, and regardless of actions taken, Rimuru couldn’t get a response from Cream. The only way Rimuru seems to regain network access through Cream’s Ethernet port is to break out Shion, remote into Cream over the wireless, and toggle the sharing property off and on again on the Wi-Fi interface.
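
In theory, that off-and-on dance could at least be scripted rather than clicked through the adapter GUI. A rough Python sketch of the idea, using the HNetCfg COM interface that ICS exposes; this assumes pywin32, an elevated prompt, and that the adapter really is named "Wi-Fi":

    import win32com.client

    # INetSharingManager: the same plumbing behind the adapter's Sharing tab
    mgr = win32com.client.Dispatch("HNetCfg.HNetShare")

    for conn in mgr.EnumEveryConnection:
        props = mgr.NetConnectionProps(conn)
        if props.Name != "Wi-Fi":
            continue
        cfg = mgr.INetSharingConfigurationForINetConnection(conn)
        if cfg.SharingEnabled:
            cfg.DisableSharing()
        cfg.EnableSharing(0)  # 0 = ICSSHARINGTYPE_PUBLIC, i.e. the shared side
        print("Bounced sharing on", props.Name)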

So I think I can say that ICS is a good ad-hoc solution. The kind where you’re in a closet and need an Ethernet-to-wireless bridge and don’t have a Raspberry Pi handy, except most laptops no longer come with wired network adapters. Having failed the great reboot test, I am declaring it ineffective for my purposes versus switching to a mesh network.

Ya know, I’m reminded that letting Cream remain on Windows 10 instead of wiping it out for a load of FreeBSD or Debian was in itself an experiment. Yeah, I guess asking for more than Plex and SMB was too much for this experiment. But I suppose I’ve gotten a lot of use out of this little NUC, even if there’s been plenty of times I’ve wanted to turn it into creamed corn.

The L-Shaped Beast

The new desk is now largely operational, although it’s going to be a while before a proper chair mat arrives.

It’s amazing having the room to pull the desk out enough to be able to access cables. Not to mention, you know, having an actual study instead of a tiny ass desk crammed next to a couch or a bed.

I’ve made use of my host of binder clips to help secure cabling and keep it mostly out of sight, while taking advantage of the space to have a much easier time swapping cables between Rimuru (desktop) and Shion (laptop). Unlike my old desk, the monitor arm even has enough room to clamp on the desk instead of getting all edgy, lol.

Amusingly to me, Rimuru doesn’t connect to Wi-Fi despite it being a feature of the replacement motherboard. So for right now the gaming focus will likely remain on the Steam Deck. In any case, Shion is happy as a clam and gets decent 5 GHz everywhere in the building. In the long term, I might see if my aging Asus plays nice with younger models, since it supports their mesh mode, or just wait until Wi-Fi 7 is a thing.

Anyway, I’ve rather missed having an actual desk space with mouse, monitor, and keyboard. Shion and Nerine serve me well for most computer tasks outside of gaming, but there are times when the dire lack of places to sit and work is irksome. For bonus points, the L-shaped beast offers ample room for all my stuff and a similar layout to my space at work.

Wi-Fi signal strength

In Googling how devices map signal strength to the number of bars on a Wi-Fi icon, I came across this wonderful page from Dong Knows Tech. I think I might clip it to my notes for ease of sharing, as it strikes a nice balance between what someone may want to know without going too deep into the maths. Which, over the years, I’ve mostly learned to just pay attention to as doublings and halvings of power, because smarter people than me created radio stuff. That page also gives good reinforcement for the less savvy that while more dBm is better, the difference between two values in dBm isn’t a straight line: it’s curvy. I especially like how it explains the difference between broadcast power and received signal, because most normal people don’t use negative numbers as often as us code monkeys do.

It was also helpful for me since I’ve now learned how to bring up the data on my Mac without having to pop over to System Information. Sitting in my dining area off the kitchen, Shion gets a respectable enough -74 to -77 dBm. Now consider: my 10-year-old Asus is literally across the house and on the second floor, making this the furthest point from my router that doesn’t involve sitting in front of the fireplace or stepping outside onto the patio. My dining area is actually the worst point inside despite the fireplace being further from the router, because the stairs and kitchen cabinets lie in between. That is to say, my fireplace has better line of sight but worse distance to my router; my dining area has shorter distance but more obstacles in terms of pipes, studs, and drywall, and, you know, an actual floor/ceiling instead of just looking over the upstairs railing. The dBm value is a good read of this: standing in front of my fireplace, Shion reads about -62 to -67 dBm.
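
To put numbers on that curve: dBm is just 10 times the log of the power in milliwatts, so every 3 dB is roughly a doubling or halving. A quick sketch plugging in my own readings (which of course bounce around in practice):

    # dBm -> milliwatts: dBm = 10 * log10(mW), so invert it
    def dbm_to_mw(dbm: float) -> float:
        return 10 ** (dbm / 10)

    for spot, dbm in [("fireplace", -62), ("dining area", -77)]:
        print(f"{spot}: {dbm} dBm = {dbm_to_mw(dbm):.10f} mW")

    # -62 dBm is ~6.3e-7 mW and -77 dBm is ~2.0e-8 mW: a mere 15 dB gap
    # works out to about 32x less received power, not a linear difference.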

For me, I’m finding that the 5 GHz Wi-Fi 5 (802.11ac) from my old RT-AC68 is good enough that I am getting usable signal virtually everywhere in my home. On the 2.4 GHz band, devices show full Wi-Fi bars pretty much everywhere when scanning for networks. Most of my devices are on 5 GHz, so I’m finding a lack of reason to shift. I was a little concerned about what the modem’s location would do to signal if my router is next to it.

And then I remembered, I used to have to put an entire apartment building between me and my Asus before my phone’s Wi-Fi crapped out 😂

ARM Power

When you realize you haven’t charged your laptop in more than a week and it still has half a charge left.

Stick that in your x86, Intel!

Sometimes I wonder

Growing up, finances were often tight. Whenever I hear the Justin Moore song “We Didn’t Have Much” and its lyric that “We had it all when we didn’t have much”, the break in the stanza often makes me think back. My family didn’t have much and certainly didn’t have that song’s kind of “All” during my formative years, but we had all the things. I always found it amazing as a kid that despite how tight things were, we had 3 TVs and 3 VCRs, which to a little boy seemed an order of magnitude more wealthy than we were by a long shot. Of course, the way that worked out is a lot of our things were often rent-to-own or the bank of grandma, and mine were often hand-me-downs. I didn’t care that my VCR was probably the first one pa bought back in the ’80s; it was just awesome sauce being able to watch VHS off in my own corner. When I was older, I found it more amazing that we had so much given what my mother had to work with. That’s the kind of way it was.

When I got to be older, I noticed the effects of this when observing others. As a teenager, I had come to the conclusion that my willingness to spend $1 was probably about how willing most folks I knew were to spend $20. Since we had little to work with, it was often imperative to spend it wisely, especially for big stuff, because if we screwed up there might not be the option to take it back or buy another. A lot of times the only options were the cheaper ones and the worse deals, but we still had little cause to complain. Like my first laptop: I had the third-cheapest laptop at Best Buy, because the cheapest was sold out and the second cheapest couldn’t run FreeBSD. Despite that, I loved that laptop and used it for about six years and a lot of my early programming.

Sometimes I wonder about how this has affected my mentality as an adult. Actually, I think my current laptop best reflects how childhood affected my purchasing decisions. Shion is the most expensive laptop that I’ve ever bought. It was very carefully planned and budgeted for, and it was very carefully decided how much the cost was worth to me versus the value for those dollars. Kind of like my dad, I don’t have a problem spending an inordinate amount of money on something to solve a problem; but like my widowed mother, I learned to spend it well when I do.

I also developed a metric for factoring into these sorts of problems: value over time. It’s kind of like amortization, but the formula is simpler since there’s no loan interest. When shopping for my laptop, I tallied the cost of the various configurations and their value to me. Then I broke it down based on how many years I might use the system: 3 years, 5 years, 7 years, or 12 years. From experience, I know that the average time I will use a computer is approximately 6 years; it may be a few years less or a few years more, but about 6 years is the average. So, to make the purchase worthwhile, the value had to be a good deal at the 5-year mark and an acceptable deal at the 3-year mark, and at least balance out by the end of the decade.
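
In code form, the metric is nothing fancier than cost divided by years of service, checked against those horizons. A minimal sketch, with a made-up price:

    # Value over time: price spread across the years I might use the machine.
    price = 2000.00  # hypothetical laptop cost, not what Shion actually ran me

    for years in (3, 5, 7, 12):
        print(f"{years:2d} years -> ${price / years:7.2f} per year")

    # With ~6 years as my historical average, the 5-year figure is the one
    # that has to look like a good deal before I'll commit.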

Likewise over the years, I developed a concept for the obsolescence of computers. If you buy the cheapest laptop you can get from the current hardware and use it heavily, it will probably be worth buying a faster cheapest computer in another year or two: by then, you’ll often pass the point where a task you do very frequently becomes enough of a bottleneck that being able to do it faster is worth the upgrade cost. The opposite is also true, but with different numbers: buy the fastest machine you can get, and in about 10 years it will be about as good as the ‘cheapest’ option you could replace it with a decade later. That balances out with the average time I use computers: in about 5 years, enough things will have changed that if the system isn’t enough of a bottleneck to be worth replacing yet, it will be soon, therefore start planning; and if it’s already a bottleneck, start planning.

The moral of this planned obsolescence is don’t be first and don’t be last to upgrade; rather, upgrade when the improvements are worth it. And if everything goes sideways in about ten years, whatever you can afford won’t be any worse than a ten-year-old computer, lol.

Shion has now been in service for approximately 1 year. So far, it’s proving to be quite effective, with no sign of retirement on the horizon. Based on earlier calculations a year ago: in another year it will have proven to be an OK deal; next year it will have proven to be a good deal; by the third year it will be a great deal; and by 5 years, I’ll definitely have gotten my money’s worth. Here’s hoping that I don’t drop it out a window or sit on it by mistake 😂.

An Experiment In Notes

When I originally tried Evernote a long-assed time ago, I didn’t really care for it, because I was seeking a solution for my non-homogeneous network and disliked the lack of structure. But when 90% of my use cases became an Android tablet, it eventually won the war and displaced my previous solutions. In the end things worked out quite swell, and its data model has fit my style of digital brain quite nicely. Twelve years later, I’ve stuck through Evernote’s more lackluster periods and high points, but I’m a little less enthusiastic about the recent transition.

As such, I decided to conduct an experiment that I’ve been thinking of for a while: investigating the runners-up. In this case, Apple Notes. But I’m afraid to say that it appears to be a washout for my use cases.

Much like a younger version of Evernote, I view Apple Notes as a kind of “Meh, good enough” experience. Both offer a word-processor experience more than a semantic one: the level of detail is formatting like headers rather than sections, plus the common formatting, yada yada. Outside of differences like Notes offering shortcuts like shift+cmd+h and Evernote accepting ‘# and your text’ as alternatives to the GUI, it’s mostly distinctions of taste and finer details. The typical stuff is all there.

I personally dislike that Notes uses inline hashtagging rather than separate metadata, given its use of a database-oriented storage model, and I prefer Evernote’s handling of attachments. But neither is a hill to die on. For a great majority of tasks, I don’t think the differences are enough to moan about beyond preference, so I’d mostly say: use whichever you like, or whichever works best for you.

The part where the experiment fails for me is performance: Notes is slow.

As an initial test case, I imported most of my Evernote data, using the opportunity to update my local backups with fresh ENEX exports. Notes supports importing Evernote’s native export format of ENEX, which made it the first candidate for the experiment, and it even performs fairly well importing large numbers of notes. I decided to collect the data under an “Evernote Imports” folder to serve as the root for recategorizing my notes, and that’s where the first failure point comes into play. Dragging and dropping lots of notes, or a folder with lots of notes, to a new destination is SLOW. Moving folders on the order of 40 to 300 notes is the kind of slow where you see Apple’s spinning rainbow (the Mac’s take on Microsoft’s hourglass of yesteryear) for 30 seconds and then walk off for a fresh glass of water. Based on the experiment, I believe this has more to do with folders that contain many attachments than with many notes in general, as if it’s executing a rather bulky database transaction. To be fair, this isn’t a common occurrence, as I’m more prone to moving handfuls of notes than entire “Notebooks” worth, unless I’m reorganizing and cleaning out my notes, which I typically do every few years.

Less excusable, however, is the sync. For comparison, I’m used to initial syncs of Evernote taking some hours. Notes, on the other hand, was a screw it / going to bed / still not done in the morning level of sync performance syncing to my tablet. Likewise, opening Notes after a long while equals a “Huh” level of slow sync, and I’m finding that folders often aren’t in the correct location after it finishes. For me, that’s a deal breaker.

In my case, Evernote represents just over 3700 notes and exports to somewhere between 2.5 and 3 GB of ENEX files. My test subset is more like 2200 notes, so the strain on iCloud should be considerably less, give or take the database overhead. Perhaps this is a lot more notes than the typical user has, but for me the performance is enough to preclude Apple Notes as an Evernote replacement: Evernote handles sync just fine while Notes chokes.
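
For the curious, ENEX is just XML under the hood: an en-export root wrapping note elements, with attachments stored as base64 inside resource/data. A throwaway sketch of how I’d tally notes versus attachment weight, to test the “attachments are the real problem” theory (the base64 math is approximate):

    import sys
    import xml.etree.ElementTree as ET

    notes = 0
    attachment_bytes = 0
    # iterparse keeps memory sane on multi-gigabyte exports
    for _, elem in ET.iterparse(sys.argv[1], events=("end",)):
        if elem.tag == "data":
            # base64 inflates data by ~4/3, so scale back down
            attachment_bytes += len(elem.text or "") * 3 // 4
        elif elem.tag == "note":
            notes += 1
            elem.clear()  # free the finished note and its children

    print(f"{notes} notes, ~{attachment_bytes / 2**20:.0f} MiB of attachments")

If the theory holds, the folders whose notes carry the most attachment bytes should be the slowest ones to move.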

Safari scrolling to the bottom

One of the things that has pissed me off lately is Safari deciding any press of the down arrow key should keep scrolling ad infinitum to the bottom.

Solution: https://apple.stackexchange.com/questions/459274/the-up-and-down-arrows-not-working-correctly-in-safari

I’m just going to assume there was a reason for this feature in the first place that doesn’t involve stupidity.

digiKam databases

Well, this is nifty. According to the documentation, digiKam supports using MySQL/MariaDB as a backend, as an alternative to local SQLite files. Plus it documents the constraints relevant for using digiKam across multiple computers with respect to databases and collections.
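
A minimal sketch of the one-time, server-side setup a remote backend like that would want: a dedicated database and a user digiKam can log in as. The host, names, and passwords here are all made up, and digiKam itself then gets pointed at it from its own database settings:

    import mysql.connector  # pip install mysql-connector-python

    conn = mysql.connector.connect(host="fileserver.local",
                                   user="root", password="hunter2")
    cur = conn.cursor()
    cur.execute("CREATE DATABASE IF NOT EXISTS digikam")
    cur.execute("CREATE USER IF NOT EXISTS 'digikam'@'%' "
                "IDENTIFIED BY 'changeme'")
    cur.execute("GRANT ALL PRIVILEGES ON digikam.* TO 'digikam'@'%'")
    cur.execute("FLUSH PRIVILEGES")
    cur.close()
    conn.close()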

On the whole, I’ve found the documentation pretty good and comprehensive. Once in a blue moon the English feels a little off, but the docs are pretty solid. I guess between 17 years of active development and a growing professional-grade feature set, I should have expected the docs to be worth more than five minutes.

Good on you, digiKam contributors!

Thoughts on photo management

Along with taking the day for mental health and generally trying to be sane, I’ve been thinking about the future of how my photos are managed and how that needs to evolve.

The present system is pretty much this:

  1. Photos are cached to preferred cloud storage (+2 copies).
    • One is cleared periodically ‘en masse’ after draining.
    • One is cleared periodically during ‘archiving’.
  2. Photos are archived to my file server (+3 copies).
    • Master copies under my Plex media library.
    • Periodically backed up to another local location.
    • Entire file server is backed up locally.
  3. Photos are archived unfiltered to cloud storage (+1 copy).

Now, there’s a few problems with this scheme, aside from me getting off my butt closer to quarterly or yearly than monthly to drain cached images into the master. Over the years, the definition of step 3 has changed a bit. Another problem has been the evolution of formats: I’ve generally migrated from classic JPEG to HEIC, as I’m seeing on the order of 50% disk savings. But of course Plex doesn’t speak HEIC, and therefore viewing outside of mounting the network drive hasn’t worked in years!
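
One workaround I’ve pondered, sketched below, is rendering JPEG copies next to the HEIC masters so Plex has something it can chew on. This assumes Pillow plus the pillow-heif plugin (pip install pillow pillow-heif), and the library path is hypothetical:

    from pathlib import Path

    from PIL import Image
    from pillow_heif import register_heif_opener

    register_heif_opener()  # teaches Pillow to open .heic files

    library = Path("/volume/Photos")
    for heic in library.rglob("*.heic"):
        jpeg = heic.with_suffix(".jpg")
        if jpeg.exists():
            continue  # already converted on a previous run
        Image.open(heic).convert("RGB").save(jpeg, "JPEG", quality=90)
        print("wrote", jpeg)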

I don’t think there’s a good solution to how often I process photos through this pipeline, relative to any other habitual behavior.

There’s also the fact that whether I am draining the cache or actively looking for images, such as building my ‘Remembering Corky’ or ‘Photo Frame’ albums, doing this at the OS level kind of sucks. Explorer and Finder have actually gotten pretty good at dealing with photos since circa 2000, but aren’t exactly fun. More than once I’ve wished for something like Geeqie that my previous Unix machines had. In suffering the native tools, I found that building my Photo Frame album was really damn painful in Finder’s gallery view, until I decided to just copy everything to a memory card and go through a process of deleting whatever I didn’t want to move.

Actually, the general workflow and process has sucked enough that I’ve considered writing a bit of software to help compensate, or transitioning my master copy into something more cloudy and photo-centric. Something that can offer better navigation and movement than a file-centric manager, and a little bit more database goodness than my Photos/${YEAR}/${COLLECTION}/ approach to on-disk storage.
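
For flavor, here’s a minimal sketch of the kind of script that scheme invites: file each drained image under its year and a named collection. The paths and collection name are hypothetical, and it cheats by using file modification time as the year rather than real EXIF data:

    import shutil
    from datetime import datetime
    from pathlib import Path

    cache = Path("~/Pictures/CloudDrain").expanduser()
    master = Path("/volume/Photos")
    collection = "Odds and Ends"

    for image in cache.glob("*.heic"):
        year = datetime.fromtimestamp(image.stat().st_mtime).strftime("%Y")
        dest = master / year / collection
        dest.mkdir(parents=True, exist_ok=True)
        shutil.move(str(image), dest / image.name)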

Then, in putzing around with the Steam Deck, taking its desktop mode for a test drive reminded me of an old KDE application called digiKam. It has features for basically everything but pulling free disk storage out its digital back orifice.

In the old days, I never messed around with digiKam. Partly because it, and KDE, were kind of heavyweight on my laptop back when I was a KDE user. Partly because by the time digital cameras and smartphones were part of my life, I had no KDE systems and an increasingly heterogeneous computing environment.

I’m thinking that digiKam may be a good solution to the solvable problems. It certainly should be able to handle my photo archive, which is over 40 GB and 14,000 image and video files. Actually, when the heck did this get so large? It feels like just a lustrum ago I could fit everything on one Blu-ray layer 😆. Maybe I should run WinDirStat or GrandPerspective over that. On the flip side, digiKam will probably offer much of the goodness I remember Google+ Photos having back when I used it. For me personally, being both cross-platform and open source are huge pluses. It’s also helpful that it is one of the more cross-platform KDE applications, as KDE off Linux/*BSD has become a thing.
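
Though before breaking out WinDirStat or GrandPerspective, a quick sketch like this would answer which year folders got fat (the archive path is, again, hypothetical):

    from pathlib import Path

    archive = Path("/volume/Photos")
    for year_dir in sorted(p for p in archive.iterdir() if p.is_dir()):
        total = sum(f.stat().st_size for f in year_dir.rglob("*") if f.is_file())
        print(f"{year_dir.name}: {total / 2**30:.1f} GiB")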

As far as I can tell, there’s two problems with this plan.

Problem one is the file wrangling. My photos will remain on my file server with its redundant 8 TB of storage, while digiKam’s SQLite databases are best kept locally. This means the databases will need their own backup management; a simple path is using my Mac and its Time Machine destination for that. How well sharing digiKam’s database files across different systems would work, I’m not sure, but in any case the trend has been for me to prefer one set of muscle memory.

Problem two is transient image management. See, most of what I do with images falls under two categories: either my master repo or some pipeline stage denoted above, or ‘a directory full of stuff I want to peruse’. I’m not sure that digiKam really handles that perusal factor. One of the things that I liked about running Debian and FreeBSD on my laptops was being able to throw Geeqie at that problem. Although, it might be viable to just create a staging area and export things.

In any case, it’s looking like digiKam is probably the best non-proprietary solution for the photo management hoopla that doesn’t involve me writing code to scratch itches.