The new 2020 iPad isn’t enough for Zoom school

Pretty sure that every time I’ve seen a review of the new iPad pop up, there have been three consistent complaints from reviewers:

  1. Same old design
  2. 32 GB storage
  3. Single user

Personally, I think the complaints are overrated.
On the hardware front: I’d say if it isn’t broken, why replace it? The physical design is no less awesome or crappy than it was years ago. It’s just that now you’ve got sexier models available for twice the price tag!
On my iPad Pro I’m usually using thirty-some gigs of storage, and it regularly recommends I let it deep-six a dozen or so gigs of stuff. Considering the base iPad costs a lot less, and offers even more storage (still for less) as an option: that’s pretty swell for the cheapest iPads.
The place where I nod my head in agreement however is USB-C. I’m hoping that as SoCs trickle down, the fruit company eventually goes USB-C all the way. Even if it only led to a Pencil that just replaced the charging connector: I’d call it a win.
For the most part I think iOS has deserved its criticism over the years. Slow, terribly slow evolutionary pace, but pretty good results. I personally care much more about the tablet front, since my tablet vs phone usage is something like 90% vs 10%, lol.
At this point it’s fair to call iPadOS a multitasking OS, both in the technical sense and in terms of user capabilities. Just not as ironed out as what you’ve been doing on your PC since the late ‘80s, lol.
What I find intriguing is the rise of reviewers moaning about iPadOS being a single user operating system.
That iPads are expensive is a given. That outfitting an entire family with Apple products is comically expensive is only avoidable by not doing it. But we still live in a world where sharing computers isn’t as typical as it once was.
Once upon a time, computers were so crazy expensive that time-sharing was key. There were reasons you ran a bunch of terminals off tens of thousands of dollars of equipment, and PCs costing several grand were a joke.
Today your wristwatch probably has more computing power than the old time-sharing systems. Yet most people tend to operate with either one PC per person or a device in a shared location. Ya know, the whole P in Personal Computer?
Multiple user accounts on tablets are kind of attractive for the idea of leaving a tablet in common areas with no specific user. But do you really want to pay that much for a dedicated coffee table or kitchen ‘puter?
Tablets, like phones, tend to be pretty personal, single user devices. Much like DOS PCs of old, the sharing tends to come from how they’re deployed rather than how they’re designed.
Plus if you’ve been bitching and moaning that Android tablets are shit and never get updates for so many years, you probably shouldn’t complain about how many years it takes before the cheapest iPads and your old hand-me-downs are good enough for your kids 😅.
Real people tend to be more pragmatic than nitpicking reviewers and tech blogs. And yes, sometimes you should consider price a driving factor.

I Tried to Live Without the Tech Giants. It Was Impossible.

Most people don’t go to such an extent to avoid the big tech companies; even for an experiment it’s a bit super thorough. But it makes a solid point.
“Critics of the big tech companies are often told, “If you don’t like the company, don’t use its products.” My takeaway from the experiment was that it’s not possible to do that. It’s not just the products and services branded with the big tech giant’s name. It’s that these companies control a thicket of more obscure products and services that are hard to untangle from tools we rely on for everything we do, from work to getting from point A to point B.”
Perhaps the question we really should be asking ourselves is whether or not these companies are a necessary evil.
Would such services exist, or be anywhere near as good, without the help of such companies? Ms. Hill points out the dominance of Google Maps and the interaction with things like Uber, and I think that’s kind of key. We had GPS navigation long before we had Google Maps and smartphones, but which would you rather use? Part of what made Google Maps what it is today is the insane investment: sending people and hardware off into the wild blue yonder to build a better dataset than simply importing maps and satellite photos could. Who the hell has that much money? Well, Google did. Some clown in their parents’ garage might be able to kick-start the next Apple or Amazon, but they’re not going to be able to afford to run Google Street View without monopolistic funding.
As things worked out, I’d say Amazon turned out to be a pretty great idea. But twenty-six years ago, we’d probably forgive you for thinking Bezos was crazy instead of anticipating he would become several times richer than God, building one of the world’s most well known enterprises along the way.
See, we build our success upon the success of others, and our success is often in enabling others to succeed. The question is whether we can do that without the ginormous bankrolls and the infrastructure they entail.
I’d like to think we have yet to see the last great American tech company. But without a governmental strongarm, I don’t think we will ever see these empires displaced. Not until landmark paradigm shifts cause them to exit a market, or for profitability’s sake they choose to exit or destroy one. You’re not going to beat Google Maps unless they’re incompetent and you’re hyper lucky and clever at just the right time, or they choose to shutter the entire operation. That’s just how it works at scale.
Yes, I’m pretty sure that we should refer to them as monopolies. But are they ones we need, or are they ones we can ill afford? As someone who long resisted Google and Facebook, I find that a very intriguing question.

ARM-Based 12-inch MacBook Specs Include A14X Bionic SoC, up to 16GB RAM, 20-Hour Battery Life & More

To me it would make sense if the first Macintosh to sport Apple’s ARM system on a chip was the basic MacBook. It’s entry level enough to support upselling more powerful machines, and down low enough to write it off if the horsepower fails to wow everyone’s eyes out of their sockets.
“Looking at these rumored specs, it honestly looks like Apple wants to repurpose the discontinued 12-inch MacBook to sport its own A14X Bionic SoC. Since the A14X Bionic is expected to be made on the 5nm process and not have a ridiculously high TDP, the 12-inch MacBook’s chassis should be sufficient to cool the chip”
Which would also be in line with modern Apple and Samsung devices. Not to mention, if it’s not busted: why fix it?

Apple Watch continues to help save lives in a variety of ways

While I doubt a wristwatch will ever replace a hospital for something like an electrocardiogram, I’d like to think that as far as canaries go, an Apple Watch is better than feeding it to the cat.
Coming from a family tree where heart problems are among the leading causes of death, the health features are about the only aspect of the device I find intriguing. But therein lies the real problem: even if you could convince me to pay that much for a watch, the fact remains that I do not want an iPhone to go with it.

The Next Phase: Apple Lays Out Plans To Transition Macs from x86 to Apple SoCs

Kind of happy to see this. While I don’t envision Apple ever releasing a Mac that both appeals to me and falls within my price range, I do very much want to see more “conventional” computers with ARM processors.
Based on my iPad Pro, and nearly a decade of working Android tablets to death, I think the crossover point works. High-end ARM SoCs are up to snuff for missions like the MacBook Air and iMac. For general computing tasks the problem is more that ARM-based PCs aren’t really a thing you can go out and buy, not so much a lack of horsepower. The processors kick ass at this point.
Rather, the main use case I see for x86 in Apple’s world is devices like the Mac Pro, where uncompromising horsepower should be what comes along with that absurd price tag. For regular people, we just want our computers to do our jobs promptly.
And I’m pretty sure that even the basic iPad far outsells the Mac Pro, lol. For better or worse, the demand for uber-powerful computers often goes hand in hand with software packages for highly specific and very resource intensive business tasks. Not Joe Blow checking his email or doing office files.

This is an interesting perspective IMHO. Even more so as a Pro 11 user who debates a larger model as an upgrade path in 3-5 years. The physical difference between the 11” and my 9.7” Samsung isn’t big enough to really mind, but the 12.9” is rather heavier.

I kind of abuse my tablets enough that a larger screen is a plus. But my experiences with 12” tablets have mostly been summed up as too heavy, too expensive, and too big to be a main device. But that’s more to do with 16:10. When I saw the 11” and 12.9” models side by side, I kind of decided the aspect ratio made the two iPads close enough in physical screen size that I don’t mind the greater size so much as the greater weight to heft.

For me, coming from a Galaxy Tab S3, I found the 11” Pro a pretty safe bet. Comparable enough in size and weight that it shouldn’t interfere with my couch surfing, or being my bench notebook. In practice the difference is minor, just large enough to make a keyboard case a more practical option than before, while still maintaining that near-perfect size that 9.7” tablets achieved.

Having a SoC that should be suitable for about as long as the iOS updates keep coming, or until hardware makes a notable difference, I’m also quite thrilled by the lasting power being closer to my laptop’s life expectancy than my tablets’. Well, give or take that my true reason for the hardware change was my Samsung’s screen getting cracked… lol.

Something I also like is Ali’s point about the Pro. Because to be honest, selecting the Pro for me was more about not rewiring all my things from USB-C to Lightning than about the performance jump from an Air to a Pro. Let’s just say, I’m not interested in the fruity connector instead of what the rest of the world is doing 😛.

Modular Computer: iPad Pro as a Tablet, Laptop, and Desktop Workstation, by Federico Viticci

Most websites covering news about software and tech for nerds are pretty meh at best. One of the reasons I like dropping by Mac Stories is because it takes a shot at doing things decently. It’s probably the only one focused on Apple that I don’t roll my eyes at, but then again I’m not part of the various fruit cults.

Also Federico is probably a worse tablet whore than I am, and after nearly a decade of using tablets: I don’t personally know anyone who uses their tablet more than I do, lol. Thus, I am more likely to find interesting stuff on Mac Stories.

Coming from the land of Android tablets, the bane of my modularity has long been the laptop issue. Handling tablet and desk mode has been straightforward for the most part, but accessories pretty much suck unless you buy some form of iPad. Meanwhile, just about everyone seems to make something for the iPad with a keyboard to go.

iPadOS bringing a desktop-style mouse experience, and discarding the Android-like one, makes me more tempted to try docking Nerine instead of relying on Stark and Centauri for desk duty, much as its many Android forebears have over the past decade.

The iPad Awkwardly Turns 10

Apple is a lot of things, some good, some bad; consistent is not one of those things.

I actually used to have a fairly high opinion of Apple’s design skill, until the first time I tried to help an iPhone user. That was somewhere around the 3GS or iPhone 4. At which point I wondered how the fuck anyone could use the things.

Over in the land of PCs and Macs, I kind of recognize that many oddities exist. A great many are also artifacts left over from a time when Apple or Microsoft did a thing, and were probably the first to really do it, rather than following on the trail of standards and successful giants. But that feeling never has repaired my opinion of the fruit company’s software. Today is also a much more connected world than the ‘90s and ‘80s were.

Apple actually does make some great stuff, and folks that helped create those products and experiences should be proud of their work. But like anything else with ten trillion moving parts, consistency kind of goes out the window quite rapidly.

I will admit though:

“How would anyone ever figure out how to split-screen multitask on the iPad if they didn’t already know how to do it?”

Is the kind of thing that led me to start making jokes about having to swipe “friend” in Elvish.

The iPad has developed a pretty nice onboarding experience, give or take four hundred privacy notices, and the user guide in the default Safari bookmarks is well worth giving to anyone who has never used an iPad before. But there is definitely, IMHO, a trend toward learning to swipe and gesture in Elvish.

The iPad at 10: A New Product Category Defined by Apps

As someone that’s come to rely on tablets heavily, despite avoiding the fruit company for much of the past decade, I kind of like the notion of tablets as a middle category—because that’s where most people’s computing lands.

A long time ago, I preferred laptops to desktops for the portability. Today, I don’t really believe in desktops so much, for two main reasons: laptops aren’t as underpowered as they used to be, and rack-mounted servers pwn most towers if you’re really going for raw compute power.

Tablets kind of answer the need to do most of what regular people do with their computers, but aren’t so tied to the concept of a monitor, mouse, keyboard, and tower. Laptops both suffer and benefit from running the same software as desktops. Tablets would suffer far more from running desktop software than they would from adding a mouse and monitor to today’s tablet software.

For better or worse: software often defines our interactions with devices. Think otherwise? Try using Windows 95 without a mouse or keyboard 🤣