Engadget: Engadget readers love the iPad Pro 12.9.

I find it curious that as things have marched on, there are really two kinds of people: those who think tablets are garbage and those who love their tablets.

In both cases, people trying to push the software tend to either miss the PC’isms they take for granted or just learn to achieve the same things in different ways.

Damned temptations

Things that tempt me: the latest XPS 15.

At this point my desktop, Centauri, is old enough that just about any current Core i5 or i7 laptop will exceed its crunch power. Let’s just say I’ve gotten Every, Last, Dollar out of my 3570K, and I’m finding its limits as the years go on. I built the system back in 2012~2013 with the plan for it to retire around 2018. Officially, its re-issued expiration date is 2023, based on it being decrepit for my use case by then.

From a bit of poking around, I imagine that between my desktop’s CPU bottleneck and how old my GTX 780 actually is, a laptop-model GTX 1650 is probably powerful enough for my gaming needs. Close enough that I don’t need to spring for a Thunderbolt eGPU dock for the old war horse until bottlenecks actually show up. That counterbalances the sales on the previous chipsets. Games like RE7 and FF15 kinda cause low-level fractures in Centauri’s Aging Kick Ass Factor.

Likewise there’s a fairly viable jumping-off point for the fifteen-inchers. The heavier-assed 15s maintain dual RAM slots and a full-sized SATA bay, whereas the lighter 13s only have the M.2 and soldered RAM. That would allow me to reuse Centauri’s SATA SSD and its whopping 1 TB of game / video storage. Migrating to 32 GB of RAM is something I fear will happen within the next ~5 years, based on my experiences with machines running 16, 12, 8, 4, 3, and 1 GB of RAM that I still use. Which means if I bought an XPS 13, the minimum capacity I’d accept would be 16 GB.

So sadly, the 15″ models become far more attractive to me than the 13″ models. Because I don’t want another couple-years-and-it’s-crap device; if I’m paying that much again, I want a use-until-it-falls-apart device.

That said, I don’t really like the idea of a laptop that weighs almost 2 kg; I’d rather one weigh closer to 0.5 kg, but getting such performance in that light a package is at least a decade or two away for PCs >_<.

The concept of paying off such a war beast makes me groan. But on the other hand, Centauri is already past her retirement age. And provided no pancaking, it would be a war beast that could both replace Centauri for the next ~5 years and deep-six the choke point of the beater I use for work.

My Chromebook 3 cost me about $50 as a decent machine for my lab bench. But it is limited by its dinky CPU, which leaves me groaning far too often as the machine struggles to keep up with my flow of terminals, emails, and tabs. So much so that I already delegate a lot to my more powerful Android tablet, with its cracked screen. Which leads me towards using my development laptop, Stark, as a bench box rather than keeping it safely on my desk. Because while Stark is about as old as Centauri, a 3360M totally nukes the crap out of an N3060 any day.

Heavy lifting usually lands upon Centauri and Stark. The difference is that the tasks which aren’t compiling code all day are both what Centauri does and where its weaknesses are growing. I imagine that Stark would remain the development system, and that Centauri would either replace my file server, Cream, or end up donated to the office.

Ahh, so much to plan.

Solving the wrong problem?

Programming language Python’s ‘existential threat’ is app distribution: Is this the answer?

I kind of can’t help but wonder if this is really about solving the wrong problem.

In dealing with the developer side of things, pip and venv really aren’t that bad compared to some of the squirrelly means of distributing software the world has known. But much beyond ‘type pip install xxx and cross your fingers’, I wouldn’t really call it a user-oriented system. It works well enough for Python developers but isn’t catered to Joe Average User or twelve-year-olds who just want to blow stuff up.
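
To make the contrast concrete, here’s a minimal sketch of what that developer-side flow looks like when scripted with the standard venv and pip modules. The package name is just a stand-in, and the env/bin path assumes a unix-like layout:

    # Minimal sketch: build an isolated environment and install a dependency into it.
    # "requests" is only a stand-in package; the env/bin path assumes a unix-like layout.
    import subprocess
    import venv

    # Create ./env with pip bootstrapped inside it.
    venv.EnvBuilder(with_pip=True).create("env")

    # Install the dependency with the environment's own interpreter,
    # so nothing leaks into the system Python.
    subprocess.check_call(["env/bin/python", "-m", "pip", "install", "requests"])

    print("Done; run env/bin/python to use the environment.")

Easy enough if you live in a terminal all day, but not exactly something you hand to Joe Average User.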

To make things easy on end users, of course, you have to solve the actual problem. Linux has a good rule about not breaking userspace, but userspace doesn’t care about you! Personally, I think that is the real pickle.

Over in NT land it’s pretty simple. You build some shit and the system ABI is probably the same across a bunch of Windows versions if you’re not too crazy, and most of the baggage is the problem of your install creation process, whether that’s some fancy tool or some rules in your Makefile. It’s impressive when you load up a video game that’s old as hell and it just works, despite being almost old enough to buy a beer. The platform wasn’t made to be efficient: it evolved to become stable. It grew up in a world where people shipped binaries and pushing changes to users was costly.

Now, by the time you have an actual Linux desktop distribution, all bets are pretty much off. A decent one will usually maintain a viable ABI for a major release number, but that doesn’t mean everything will be compatible forever, nor does it mean the binary dependencies you require will remain in xyz package repo for the next fifty years. Some of this lands on distributions and how they deal with package management to squeeze binaries into nixie norms of hierarchy. Some of it also lands on developers, who may not know an ABI from a hole in the ground because they’re used to recompiling against APIs and configuring ten thousand build-time knobs, and don’t care that changing something impacts binary compatibility between their library and the works of others.

There are reasons why things like AppImage and Flatpak exist. Many of those reasons, I think, owe to the source-centric nature of unix systems. Different communities have different norms of sharing and reuse.

When I began learning unix systems, I chose a source-centric flavour that would let me learn how things worked under the hood. The kind where you waited three and a half days because a new version of KDE or GNOME landed and many a dependency in the food chain needed to be rebuilt. The kind where you learned to diagnose linker problems and grumble, knowing that changes to library X meant recompiling half your environment if you wanted to be sure your applications didn’t combust quietly in a corner just waiting for the day you actually needed to launch them again, or curse at some major framework linking to some piddly library that triggered the same.

In the end my time with that system was dominated by two things: stability and compile times. But I didn’t choose it in order to have an easy one-click-and-done system. I had chosen it because I wanted to learn how computers worked and develop the means of figuring out why the fuck programs broke. Today, if you use that flavour of unix, you can pretty much live in a purely binary world, which wasn’t so easy when I was a padawan.

By contrast, an acquaintance of mine back then, ironically a Python programmer, had chosen a more widely known distribution that focused on having the latest binaries available without having to compile all the things. One that’s still quite popular ~15 years later. Let’s just say the usability of binary-focused distributions has improved with time, despite the warts of binary distribution in *nix land. Or to summarize it thusly:

When it came time for a major version upgrade, I spent a few days compiling before getting back to work. He spent a few days cursing and then reformatted, lol.

Lifehacker: The Raspberry Pi 4’s Most Interesting Quirks.
https://lifehacker.com/the-raspberry-pi-4s-most-interesting-quirks-1835871780
Sounds like everything I found bothersome has been improved, except for trying to do any kind of serious video playback without using omxplayer.

MSPoweruser: Bill Gates calls losing the smartphone market to Android his “greatest mistake”.

In times equally long past, I would have guessed that if smartphones became a thing, they’d probably run something derived from Windows CE or be a tad Palmy. Meanwhile, what I would have wanted back then would probably have looked something like the PDAs from Sharp and Nokia.

Today, for better or worse, Android is insanely successful. It is to mobile what Windows is to PC in just about every sense of the word, except it grew up with underpinnings built around Linux and Java rather than DOS and C. It also has had the benefit of hindsight: those creating it came from a world populated by Windows and Unix-based systems.

Microsoft is doing pretty well today. Hell, they’ve even made some nice software and services. But they are dead as a platform for mobile. Want to use Microsoft things on the go? Your phone probably runs Android. If not, well, it sure won’t be running a Microsoft operating system!

For about the last five years or so, I’ve kind of wondered whether, in about another thirty years, Android will eventually overtake Windows. Much as Microsoft pretty much became the gold standard while Unix vendors fiddled and burned.

One thing is for sure: the Linux kernel ain’t disappearing anytime soon 😜.

Chrome Unboxed: Google Assistant Could Arrive On All Chromebooks In Late Summer.

For the most part I’ve been done with Google Assistant, and lax in using such tools. But I wouldn’t mind seeing them in more places as an option.

My relationship with voice tools tends to take two forms: pressing the microphone button on my phone and sighing at Google’s failures to handle my reminders, and pressing the microphone button on my remote and asking Alexa to launch something I want to resume watching on Fire TV.

It’s nice to have options even if most of the options have failings.

Even my desktop is able to use Alexa and Cortana without much effort; Google Assistant, not so much. But of course none of them really do much that I find useful on that machine.

Ars Technica: A tale of two cities: Why ransomware will just get worse.

https://arstechnica.com/information-technology/2019/06/a-tale-of-two-cities-why-ransomware-will-just-get-worse/

Personally, I think it sets a bad precedent to pay such a ransom. The odds of such things happening probably go up when they turn out to be profitable rather than purely gloat factors.

However, I do understand that not everyone goes into writing a disaster recovery plan starting off with the statement, “Oh fuck! My computer just exploded!” as the baseline. I can just imagine how peachy most town and county networks would fare… like a flaming car on a roller coaster track, descending into a pit of kerosene.

Reflections on my road to becoming a computer nerd

Generally I would say there were about four things that really got me into computers.

  1. Information access.
  2. Word processing.
  3. Video games.
  4. Community.

Growing up in an environment where your choices were the family encyclopedias and dictionary, both about as old as me, or waiting a week or two for a trip to the library to actually happen, I was somewhat fortunate in the sense that I could check out books and learn about how nuclear power or jet engines function, and not worry about what people thought. Today, I’m not quite sure I’d wanna see the alarmed glares kids might get at the stuff I read back then, but I guess there aren’t that many librarians left either.

The online world, however, made a very different set of information available than the bookstores and library could offer me. Two websites especially: the Gundam Project and the Mecha & Anime HQ. While MAHQ is still around, the former went defunct before my family switched from dial-up to broadband. As my interests exploded, I found that increasingly the Internet was the way to gather information. You could go to the used bookstore and get books on Star Trek and Star Wars. You couldn’t find so much related to Mobile Suit Gundam and Macross. Hell, the nearest source of anime was probably a two-hour drive to a Suncoast, and that usually made it both very rare and expensive for our income level.

Likewise, as my interests exploded, I generally faced two problems. One was the inefficiency of handwriting all your nerdy documents. The second was how painful corrections were with a typewriter when your spelling is less than 110% of perfection. I don’t think I have even touched one since the 6th grade, but correction tape integrated into a typewriter is still among my definitions of wasted time and misery. When I gave the computer a shot at these matters, what those older than I dubbed word processing, my fate was rather sealed. Because between the rapid access to information and the ease of editing text, I came to spend inordinate amounts of time in front of computers.

Once we made the transition into the Pentium 4 era, we finally had a computer worthwhile for gaming. Well, at least for games that didn’t come on and run from floppy diskette. Early in my childhood we had both a Tandy 1000 and an NES, so I’ve been exposed to video games in one form or another longer than I have been able to read my native language. But most of our computers in between weren’t worth much for games, which generally got dumped on consoles.

The rise of multiplayer gaming pretty much created and defined my social connections outside the meatspace, and that largely remained the only link until I began getting into unix systems and learning programming as a teenager.

Strangely, today video games are still a major point of my computer use. It was around 2007 or so when I hit the point that FreeBSD could replace my XP machines, except for the damned Direct3D gaming pickle. At this point I don’t think I would even have built my desktop if it wasn’t for Steam. My next PC will probably be a laptop and an eGPU rather than a tower.

But that’s really where things intersected with other people.

I was quite active in a few gaming circles, and as my knowledge of computers grew, so did my participation in circles built around those topics. Many years later, I still have friends that I met through those circles. Well into my early twenties, I was still very active in various forums and newsgroups related to my interests. As time has gone on, most people have generally moved in the direction of services like Facebook and the late G+, and so have I. Today that largely takes the form of Diaspora and the Pluspora pod.

As I reflect upon the road that led me here, I do wonder whether that is a good or a bad trend. But I think it really owes to two facts. A lot of the social things we do with the Internet are like scraps of paper: detritus and transient. Things like G+ made it a lot easier to bring people together than when you had to manage many memberships and connect to dozens of systems, but they never changed the fact that most of our output is pretty much digital scraps. These aren’t communities that will outlast national governments and treasures in a museum; rather, the things we post are closer to asking what some Tom, Dick, or Harry had for lunch in the 19th century. It’s all transient at the backbone, but we enjoy it while we can.

On the flip side, the warehouse of old data on my cold storage drive is rather easier to deal with than stacks of old handwritten and typeset papers. And more than a few of the places I’ve gone have allowed me quite a bit of ease in backing things up, hehe.

Neowin: User concept re-imagines File Explorer with Fluent Design.

https://www.neowin.net/news/user-concept-re-imagines-file-explorer-with-fluent-design/

I’d actually like to see something like that. Generally I’ve come to appreciate the new-age UIs that pop up in W10, mostly because I’ve already suffered their design evolution on mobile platforms, much as I did various desktop horrors from Unix and Microsoft systems.

Windows 10’s file explorer largely keeping the status quo left me with mixed feelings. But the fact remains: if you end up suffering a GUI file manager, Microsoft’s is the gold standard to curse at.