A little ray of sunshine after the disappointment of the Tab S5e

Exclusive: This is the Galaxy Tab S6 and it has a dual camera.


The return to 16:10 is still as disappointing as it was when the Tab S4 came out, but that’s nothing compared to the sadness of my Tab S3 having a cracked screen and the latest model launching with no pen and mid-range specs.

I do have to admit, though, that in terms of hardware it’s becoming harder to justify Samsung’s tablets over Apple’s. Today it’s mostly the fact that, having been here since the era of the Streak 7 and EeePad Transformer, I know Android fulfills my software needs with flying colors, while I still expect iOS to make me grumble and groan.

My upgrade path has been looking rocky. Since the crack, the only Android option has been the Tab S4, which has already aged a bit. Over in fruit co land there are at least the 11″ and 10.5″ iPad Pro models as viable successors.

In reality of course I’d just like my screen’s crack not to expand for another year or two… lol.

Looking through Google Play’s Editors’ Choice section, I’m happy to see ArtFlow listed.

Think I’ve had this on my devices since the days of the Note series tablets. It’s become about the only drawing app on Android that I tend to care about.

BGR: Bill Gates explains why he couldn’t duplicate Steve Jobs’ magic ‘spells’.

“I was like a minor wizard because he would be casting spells, and I would see people mesmerized, but because I’m a minor wizard, the spells don’t work on me,” Gates said, according to advance news of his remarks reported by Bloomberg.

The minor wizard bit might seem modest, but I’ve got to admit: it strikes me as an excellent way of putting it. In a way it also makes me wonder how many tabletop RPGs Bill Gates may have played over the years, lol.

To archive or not to archive: that is the question!

In a way I wonder which is more sad.

That for all these years I have kept my e-mail archives well organized, e.g. different labels for different types of messages being archived. It’s a very clean system.

Or that it’s probably taken more than a decade for my laziness to tell me “Meh, why don’t you just use one archive directory.”

Because when I actually do hunt for old messages, there’s this thing called “Search” that tends to work well enough that I don’t need to refine by where I archived. And the taps it takes to file messages into this or that archive cost me more time per year than just smacking archive would, especially given how rarely I pull up old messages in a decade.

Yes. I have a feeling lazy may win out.

Forbes: The New Pinebook Pro Will Challenge Google Chromebooks For $199.

Can’t say that I’ve ever cared a lot about Rockchip’s SoCs but that actually sounds pretty damned tempting. The chipset should deliver a really nice bang for the buck. Very tempting indeed!

At least as far as 14″ laptops go with anything weaker than a Core i7 and 32 GB of RAM, and for those you’d have to shift the price tag over a decimal place. Let’s just say for $200: you have no right to complain about the horsepower that a Cortex-A72/A53 setup like that can deliver.

Intel’s chipsets targeting that $200 price point tend to struggle just playing my music and opening tabs without stuttering. In fact, getting tired of that is the number one reason my Chromebook will likely be retired this year or next.

Hmm, just for the hell of it, I wonder how bad the graphics driver is for the RK3399’s GPU. Last time I tried a Mali it was on an Exynos 5, and I was most unimpressed by the desktop graphics performance. But that was quite a few generations ago.

Engadget: Engadget readers love the iPad Pro 12.9.

I find it curious that as things have marched on, there are really two kinds of people: those who think tablets are garbage and those who love their tablets.

In both cases, people trying to push the software tend to miss PC-isms that they take for granted, or just learn to achieve the same things in different ways.

Damned temptations

Things that tempt me: the latest XPS 15.

At this point my desktop, Centauri, is old enough that if you buy any variant of Core i5 or i7 laptop, the crunch power will exceed it. Let’s just say I’ve gotten Every. Last. Dollar. out of my 3570K, and I’m finding its limits as the years go on. I built the system back in 2012~2013 with the plan for it to retire around 2018. Officially, its re-issued expiration date is 2023, based on it being decrepit for my use case by then.

From a bit of poking around, I imagine that between my desktop’s CPU bottleneck and how old my GTX 780 actually is, such a laptop’s GTX 1650 is probably powerful enough for my gaming needs. Close enough that I don’t need to spend on a Thunderbolt eGPU dock for the old war horse until bottlenecks actually show up. That counterbalances the sales on the previous chipsets. Games like RE7 and FF15 kinda cause low-level fractures in Centauri’s Aging Kick Ass Factor.

Likewise, there’s a fairly viable jumping-off point for the fifteen-inchers. The heavier-assed 15s maintain dual RAM slots and a full-sized SATA bay, whereas the lighter 13s only have the M.2 and soldered RAM. That would allow me to reuse Centauri’s SATA SSD and its whopping 1 TB of game / video storage. Migrating to 32 GB of RAM is something I fear will happen within the next ~5 years, based on my experiences with machines with 16, 12, 8, 4, 3, and 1 GB of RAM that I still use. Which means if I bought an XPS 13, the minimum capacity would be 16 GB.

So sadly, the 15″ models become far more attractive to me than the 13″ models. I don’t want another couple-years-and-it’s-crap device; if I’m paying that much again, I want a use-until-it-falls-apart device.

That said, I don’t really like the idea of a laptop that weighs almost 2 kg as much as I’d rather one weigh closer to 0.5 kg, but getting such performance in that light a package is at least a decade or two away for PCs >_<.

The concept of paying off such a war beast makes me groan. But on the other hand Centauri is already past her retirement age. And provided no pancaking: it would be a war beast that could both replace Centauri for the next ~5 years while deep sixing the choke point of the beater I use for work.

My Chromebook 3 cost me about $50 as a decent machine for my lab bench. But it is limited by its dinky CPU, which leaves me groaning far too often as the machine struggles to keep up with my flow of terminals, emails, and tabs. So much so that I already delegate a lot to my more powerful Android tablet, with its cracked screen. Which leads me towards using my development laptop, Stark, as a bench box rather than keeping it safely on my desk. Because while Stark is about as old as Centauri, a 3360M totally nukes the crap out of an N3060 any day.

Heavy lifting usually lands upon Centauri and Stark. The difference is that the not-compiling-code-all-day tasks are both what Centauri does and where its weaknesses are growing. I imagine that Stark would remain the development system, and that Centauri would replace my file server, Cream, or end up donated to the office.

Ahh, so much to plan.

Solving the wrong problem?

Programming language Python’s ‘existential threat’ is app distribution: Is this the answer?

I kind of can’t help but wonder if this is really about solving the wrong problem.

In dealing with the developer side of things: pip and venv really aren’t that bad compared to some of the squirrelly means of distributing software the world has known. But much beyond ‘type pip install xxx and cross your fingers’, I wouldn’t really call it a user-oriented system. It works well enough for Python developers but isn’t catered to Joe Average User or twelve-year-olds who just want to blow stuff up.
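For anyone who hasn’t touched that side of Python, here’s a rough sketch of what the developer-facing tooling does under the hood, using only the stdlib venv module; the "demo-env" directory name is just for illustration:

```python
# Sketch: what `python -m venv` sets up, via the stdlib venv module.
import os
import subprocess
import sys
import tempfile
import venv

# Create an isolated environment in a throwaway directory.
target = os.path.join(tempfile.mkdtemp(), "demo-env")
venv.create(target, with_pip=False)  # skip pip bootstrap to stay offline-friendly

# The environment gets its own interpreter shim ("Scripts" on Windows).
bindir = "Scripts" if os.name == "nt" else "bin"
env_python = os.path.join(target, bindir, "python")

# Inside the venv, sys.prefix points at the environment rather than the
# system install, which is what keeps `pip install xxx` from polluting it.
result = subprocess.run(
    [env_python, "-c", "import sys; print(sys.prefix != sys.base_prefix)"],
    capture_output=True, text=True,
)
print(result.stdout.strip())  # → True
```

It’s a perfectly serviceable mechanism for developers; the point is just that none of this is anything Joe Average User should ever have to know exists.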

To make things easy on end users, of course, you have to solve the actual problem. Linux has a good rule about not breaking userspace, but userspace doesn’t care about you! Personally I think that is the real pickle.

Over in NT land it’s pretty simple. You build some shit and the system ABI is probably the same across a bunch of Windows versions if you’re not too crazy, and most of the baggage is the problem of your install creation process. Whether that’s some fancy tool or some rules in your Makefile. It’s impressive when you load up a video game that’s old as hell and it just works, despite being almost old enough to buy a beer. It wasn’t made to be efficient: it evolved to become stable. It grew up in a world where people shipped binaries and pushing changes to users was costly.

Now by the time you have an actual Linux desktop distribution: all bets are pretty much off. A decent one will usually maintain a viable ABI for a major release number but that doesn’t mean everything will be compatible forever, nor does it mean the binary dependencies you require will remain in xyz package repo for the next fifty years. Some of this lands on distributions and how they deal with package management to squeeze binaries into nixie norms of hierarchy. Some of this also lands on developers, who may or may not know what an ABI is from a hole in the ground because they’re used to recompiling to APIs and configuring ten thousand build time knobs and don’t care that changing something impacts binary compatibility between their library and the works of others.
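As a toy illustration of that last point, here’s the classic trap sketched with Python’s ctypes standing in for C structs (the Widget types are made up for the example): adding a field is perfectly source-compatible, recompile and you’re fine, but it changes the size and layout that already-compiled binaries baked in.

```python
# Sketch: a source-compatible change that is binary-incompatible.
import ctypes

class WidgetV1(ctypes.Structure):
    """v1 of a library's public struct."""
    _fields_ = [("id", ctypes.c_int)]

class WidgetV2(ctypes.Structure):
    """v2 adds a field; old callers never hear about it."""
    _fields_ = [("id", ctypes.c_int),
                ("scale", ctypes.c_double)]

# A program compiled against v1 allocates sizeof(WidgetV1) bytes; a v2
# library writing `scale` would scribble past the end of that allocation.
print(ctypes.sizeof(WidgetV1), ctypes.sizeof(WidgetV2))
```

The exact sizes depend on the platform’s alignment rules, which is part of the joke: the ABI is a property of the compiled artifact, not of the source the developer was looking at.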

There are reasons why things like AppImage and Flatpak exist. Many of these I think owe to the source centric nature of unix systems. Different communities have different norms of sharing and reuse.

When I began learning unix systems, I chose a source-centric flavour that would let me learn how things worked under the hood. The kind where you waited three and a half days because a new version of KDE or GNOME landed and many a dependency in the food chain needed to be rebuilt. The kind where you learned to diagnose linker problems and grumble, knowing that changes to library X meant recompiling half your environment if you wanted to be sure your applications didn’t combust quietly in a corner, just waiting for the day you actually needed to launch them again; or where you cursed at some major framework linking to some piddly library that triggered the same.

In the end my time with that system was dominated by two things: stability and compile times. But I didn’t choose that in order to have an easy, one-click-and-done system. I had chosen it because I wanted to learn how computers worked and to develop the means of figuring out why the fuck programs broke. Today, if you use that flavour of unix, you can pretty much live in a purely binary world, which wasn’t so easy when I was a padawan.

By contrast, an acquaintance of mine back then, ironically a Python programmer, had chosen a more widely known distribution that focused on having the latest binaries available without having to compile all the things. One that’s still quite popular ~15 years later. Let’s just say the usability of binary-focused distributions has improved with time, despite the warts of binary distribution in *nix land. Or to summarize it thusly:

When it came time for a major version upgrade: I spent a few days compiling before getting back to work. He spent a few days cursing and then reformatted, lol.