To archive or not to archive: that is the question!

In a way, I wonder which is sadder.

That for all these years I have kept my e-mail archives well organized, with different labels for different types of messages being archived. It’s a very clean system.

Or that it’s probably taken more than a decade for my laziness to tell me “Meh, why don’t you just use one archive directory.”

Because when I actually do hunt for old messages, there’s this thing called “Search” that tends to work well enough that I don’t need to refine by where I archived them. And the taps spent moving messages to xyz archive consume more time per year than just smacking “Archive” ever would, especially given how rarely I pull up old messages per decade.

Yes. I have a feeling lazy may win out.

Forbes: The New Pinebook Pro Will Challenge Google Chromebooks For $199.

Can’t say that I’ve ever cared a lot about Rockchip’s SoCs but that actually sounds pretty damned tempting. The chipset should deliver a really nice bang for the buck. Very tempting indeed!

At least as far as 14″ laptops with anything weaker than a Core i7 and 32 GB of RAM go, and for those you would have to shift the price tag over a decimal place. Let’s just say for $200: you have no right to complain about the horsepower that a Cortex-A72/A53 setup like that can deliver.

Intel’s chipsets targeting that $200 price point tend to struggle just playing my music and opening tabs without stuttering. In fact, getting tired of that is the number one reason my Chromebook will likely be retired this year or next.

Hmm, just for the hell of it, I wonder how bad the graphics driver is for the RK3399’s GPU. Last time I tried a Mali it was on an Exynos 5, and I was most unimpressed by the desktop graphics performance. But that was quite a few generations ago.

Engadget: Engadget readers love the iPad Pro 12.9.

I find it curious that as things have marched on, there are really two kinds of people: those who think tablets are garbage and those who love their tablets.

In both cases, people trying to push the software tend to miss PC-isms that they take for granted, or just learn to achieve the same things in different ways.

Damned temptations

Things that tempt me: the latest XPS 15.

At this point my desktop, Centauri, is old enough that if you buy any variant of Core i5 or i7 laptop, the crunch power will exceed it. Let’s just say I’ve gotten Every, Last, Dollar out of my 3570K, and I’m finding its limits as the years go on. I built the system back in 2012~2013 with the plan for it to retire around 2018. Officially, its re-issued expiration date is 2023, based on it being decrepit for my use case by then.

From a bit of poking around, I imagine that between my desktop’s CPU bottleneck and how old my GTX 780 actually is, a laptop-model GTX 1650 is probably powerful enough for my gaming needs. Close enough that I don’t need to spend for a Thunderbolt eGPU dock for the old war horse until bottlenecks actually show up. That counterbalances the sales on the previous chipsets. Games like RE7 and FF15 kinda cause low-level fractures in Centauri’s Aging Kick Ass Factor.

Likewise, there’s a fairly viable jumping-off point for the fifteen-inchers. The heavier-assed 15s maintain dual RAM slots and a full-sized SATA bay, whereas the lighter 13s only have the M.2 and soldered RAM. That would let me reuse Centauri’s SATA SSD and its whopping 1 TB of game / video storage. Migrating to 32 GB of RAM is something I fear will happen within the next ~5 years, based on my experiences with machines running 16, 12, 8, 4, 3, and 1 GB of RAM that I still use. Which means if I bought an XPS 13, the minimal capacity would be 16 GB.

So, sadly, the 15″ models become far more attractive to me than the 13″ models. I don’t want another device that turns to crap after a couple of years; if I’m paying that much again, I want a use-until-it-falls-apart device.

That said, I don’t really like the idea of a laptop that weighs almost 2 kg as much as I’d rather one weigh closer to 0.5 kg, but getting such performance in that light a package is at least a decade or two away for PCs >_<.

The concept of paying off such a war beast makes me groan. But on the other hand, Centauri is already past her retirement age. And provided no pancaking: it would be a war beast that could both replace Centauri for the next ~5 years and deep-six the choke point of the beater I use for work.

My Chromebook 3 cost me about $50, and it’s a decent machine for my lab bench. But it is limited by its dinky CPU, which leaves me groaning far too often as the machine struggles to keep up with my flow of terminals, emails, and tabs. So much so that I already delegate a lot to my more powerful Android tablet, with its cracked screen. Which leads me towards using my development laptop, Stark, as a bench box rather than keeping it safely on my desk. Because while Stark is about as old as Centauri, a 3360M totally nukes the crap out of an N3060 any day.

Heavy lifting usually lands upon Centauri and Stark. The difference is that the tasks other than compiling code all day are both what Centauri does and where its weaknesses are growing. I imagine that Stark would remain the development system, and that Centauri would replace my file server, Cream, or end up donated to the office.

Ahh, so much to plan.

Solving the wrong problem?

Programming language Python’s ‘existential threat’ is app distribution: Is this the answer?

I kind of can’t help but wonder if this is really about solving the wrong problem.

In dealing with the developer side of things, pip and venv really aren’t that bad compared to some of the squirrelly means of distributing software the world has known. But much beyond ‘type pip install xxx and cross your fingers’, I wouldn’t really call it a user-oriented system. It works well enough for Python developers but does not cater to Joe Average User or twelve-year-olds who just want to blow stuff up.
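For what it’s worth, the developer-side flow really is just a couple of commands; the gap is everything around them. A minimal sketch of that flow, where the app name is a placeholder rather than a real package:

```shell
# Create an isolated environment so the app's dependencies
# don't collide with the system's Python packages.
python3 -m venv demo-env

# pip lives inside the environment; confirm it's there.
demo-env/bin/python -m pip --version

# Installing the actual app would be one more command.
# "someapp" is a placeholder, not a real package:
# demo-env/bin/pip install someapp

# Run things with the environment's own interpreter;
# this prints the environment's path.
demo-env/bin/python -c 'import sys; print(sys.prefix)'
```

Easy enough if you live in a terminal; but Joe Average User is never going to type any of that, which is the distribution problem in a nutshell.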

To make things easy on end users, of course, you have to solve the actual problem. Linux has a good rule about not breaking userspace, but userspace doesn’t care about you! Personally, I think that is the real pickle.

Over in NT land it’s pretty simple. You build some shit, and the system ABI is probably the same across a bunch of Windows versions if you’re not too crazy; most of the baggage is the problem of your install creation process, whether that’s some fancy tool or some rules in your Makefile. It’s impressive when you load up a video game that’s old as hell and it just works, despite being almost old enough to buy a beer. The platform wasn’t made to be efficient: it evolved to become stable. It grew up in a world where people shipped binaries and pushing changes to users was costly.

Now by the time you have an actual Linux desktop distribution, all bets are pretty much off. A decent one will usually maintain a viable ABI for a major release number, but that doesn’t mean everything will be compatible forever, nor does it mean the binary dependencies you require will remain in xyz package repo for the next fifty years. Some of this lands on distributions and how they deal with package management to squeeze binaries into *nixy norms of hierarchy. Some of this also lands on developers, who may not know an ABI from a hole in the ground because they’re used to recompiling against APIs and configuring ten thousand build-time knobs, and don’t care that changing something impacts binary compatibility between their library and the works of others.

There are reasons why things like AppImage and Flatpak exist. Many of these, I think, owe to the source-centric nature of unix systems. Different communities have different norms of sharing and reuse.

When I began learning unix systems, I chose a source-centric flavour that would let me learn how things worked under the hood. The kind where you waited three and a half days because a new version of KDE or GNOME had landed and many a dependency in the food chain needed to be rebuilt. The kind where you learned to diagnose linker problems, and grumbled knowing that changes to library X meant recompiling half your environment if you wanted to be sure your applications didn’t combust quietly in a corner, just waiting for the day you actually needed to launch them again; or cursed at some major framework linking to some piddly library that triggered the same.
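The diagnosing part, at least, works the same today: the dynamic linker will happily tell you which libraries a binary expects and whether it can still find them. A quick sketch using `ldd` against a binary present on any Linux box; the commented lines show the failure mode, with a placeholder path and library name:

```shell
# Print every shared library /bin/ls was linked against and where the
# dynamic linker resolved each one. After a library upgrade, any entry
# flagged "not found" is the smoking gun for a binary that will refuse
# to launch until its dependency is rebuilt or reinstalled.
ldd /bin/ls

# Against your own stale binary it looks more like this
# (~/bin/oldapp and libfoo are placeholders):
# ldd ~/bin/oldapp
#     libfoo.so.5 => not found
```

That “not found” line is exactly the sort of thing a library bump used to scatter across half my environment.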

In the end, my time with that system was dominated by two things: stability and compile times. But I didn’t choose it in order to have an easy one-click-and-done system. I had chosen it because I wanted to learn how computers worked and to develop the means of figuring out why the fuck programs broke. Today, if you use that flavour of unix, you can pretty much live in a purely binary world, which wasn’t so easy when I was a padawan.

By contrast, an acquaintance of mine back then, ironically a Python programmer, had chosen a more widely known distribution that focused on having the latest binaries available without having to compile all the things. One that’s still quite popular some ~15 years later. Let’s just say the usability of binary-focused distributions has improved with time, despite the warts of binary distribution in *nix land. Or to summarize it thusly:

When it came time for a major version upgrade: I spent a few days compiling before getting back to work. He spent a few days cursing and then reformatted, lol.

Lifehacker: The Raspberry Pi 4’s Most Interesting Quirks.
https://lifehacker.com/the-raspberry-pi-4s-most-interesting-quirks-1835871780
Sounds like everything I found bothersome has been improved, except for trying to do any kind of serious video playback without using omxplayer.

MSPoweruser: Bill Gates calls losing the smartphone market to Android his “greatest mistake”.

In times equally long past, I would have guessed that if smartphones became a thing, they’d probably run something derived from Windows CE or be a tad Palmy. Meanwhile, what I would have wanted back then would probably have looked something like PDAs from Sharp and Nokia.

Today, be it for better or worse, Android is insanely successful. It is to mobile what Windows is to PC, in just about every sense of the word. Except it grew up with underpinnings built around Linux and Java rather than DOS and C. It also has had the benefit of hindsight, as those creating it came from a world populated by Windows and Unix based systems.

Microsoft is doing pretty well today. Hell, they’ve even made some nice software and services. But they are dead as a platform for mobile. Want to use Microsoft things on the go? Your phone probably runs Android. If not, well, it sure won’t be running a Microsoft operating system!

For about the last five years or so, I’ve kind of wondered whether, in about another thirty years, Android will eventually overtake Windows, much as Microsoft pretty much became the gold standard while the Unix vendors fiddled and burned.

One thing is for sure: the Linux kernel ain’t disappearing anytime soon 😜.

Chrome Unboxed: Google Assistant Could Arrive On All Chromebooks In Late Summer.

For the most part I’ve been done with Google Assistant, and lax in using such tools. But I wouldn’t mind seeing them in more places as an option.

My relationship with voice tools tends to take two forms: pressing the microphone button on my phone and sighing at Google’s failures to handle my reminders, and pressing the microphone button on my remote and asking Alexa to launch something I want to resume watching on Fire TV.

It’s nice to have options even if most of the options have failings.

Even my desktop is able to use Alexa and Cortana without much effort; Google Assistant, not so much. But of course, none of them really do much that I find useful on that machine.

Ars Technica: A tale of two cities: Why ransomware will just get worse.

https://arstechnica.com/information-technology/2019/06/a-tale-of-two-cities-why-ransomware-will-just-get-worse/

Personally, I think it sets a bad precedent to pay such a ransom. The odds of such things happening probably go up when they turn out to be profitable rather than purely gloat-worthy.

However, I do understand that not everyone goes into writing a disaster recovery plan with the statement “Oh fuck! My computer just exploded!” as the baseline. I can just imagine how peachy most town and county networks would fare… like a flaming car on a roller coaster track, descending into a pit of kerosene.