Something about sorting through receipts made me vaguely think that I need a filing cabinet and I should likely print out my digital receipts where applicable. Then I remembered I hate paper, and I mean I really hate paper. That is when it hit me.

I don’t need no freaking file cabinet, I can just store files in a secure dropbox and be done with it. Plus it’s possible to effectively “Scan” papers with my phone by using the camera and uploading them. Victory is mine in the war against wasting valuable trees :-).

Now the real question is, with the sorry state of our traditionally decrepit power grid, what does more damage to the earth, lol: more paper or more PC usage?

How to tell if your phone is female, or if Linux is smarter than Windows XP

O.K. so my phone has been giving me a message every time I plug it in, asking whether I want to turn on USB mass storage for mounting or charge her off the USB. So today I plugged it into my desktop, and I’m scratching my head trying to figure out where I turn it on because it just starts charging.

You know that your phone is female, if you plug her in and she starts to CHARGE instead of asking you to MOUNT her >_>.

Conversely you know that Linux must be smarter than XP, when the phone does ask about mounting if plugged into a Linux laptop instead of a Windows XP machine <_<.

Yeah, lol.

I’ve been experimenting with window managers lately: fluxbox, openbox, awesome, and musca. Fluxbox and openbox are pretty much just generic window managers, at least in my eyes. That said, they are well worth using for most people. Awesome and Musca are tiling window managers, and a lot more, eh, minimalist. I used to collect window managers, among quite a few other odds and ends, back when I had time for it, but I have never done the “Tiling thing” beyond a very brief test drive of dwm.

Awesome and Musca create an interesting experience: you create the windows, the window manager manages them. It’s almost alien lol. Normally you create a window, the window manager figures out where to draw it, and you do the rest, e.g. by resizing and moving it around as necessary. In these tiling window managers however, newly created windows are automatically arranged and sized by dividing containers.

Launching your first window is like maximising; launching a second window causes everything to resize and give each program half the screen, and so on based on some tiling pattern. The most common seems to be one half-sized window on the left and two quarter-sized windows on the right; it works better than you might think. Rather than resizing the windows individually, you resize the containers. So if the screen is laid out as:



|         | term |
| Firefox |------|
|         | chat |



Selecting either the term or chat window and attempting to resize will resize all three windows. Try to enlarge the chat window horizontally, and Firefox will shrink while term grows horizontally. Try to shrink the term window vertically and the chat window grows vertically, and so on.
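For the curious, the arithmetic behind a master/stack layout like the one above is pretty simple. Here’s a toy sketch in Python; this is NOT real awesome or musca code, just an illustration of how such a tiler might carve up the screen:

```python
# Toy sketch of the master/stack tiling idea: first window takes the left
# half at full height, the rest share the right half, stacked vertically.

def tile(windows, screen_w, screen_h):
    """Return {window_name: (x, y, w, h)} for a master-left layout."""
    if not windows:
        return {}
    if len(windows) == 1:
        # The first window is effectively "maximised".
        return {windows[0]: (0, 0, screen_w, screen_h)}
    half = screen_w // 2
    layout = {windows[0]: (0, 0, half, screen_h)}   # master: left half
    stack = windows[1:]
    slice_h = screen_h // len(stack)
    for i, name in enumerate(stack):
        # Remaining windows split the right half evenly, top to bottom.
        layout[name] = (half, i * slice_h, screen_w - half, slice_h)
    return layout

print(tile(["Firefox", "term", "chat"], 1024, 768))
```

Resizing a “container” then just means changing `half` or `slice_h` and recomputing every window’s geometry, which is why dragging one window moves all three.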

It’s mind-blowingly better than the style of window management people are used to these days, which dates back to like Mac OS 2 or Mac OS 3 back in the ’80s. It is also a little bit awkward to let the computer take care of something that you’ve been doing by hand for almost twenty years!

Relishing the experience, however, has made me think of something different. I was just experimenting with the MinOverlapPlacementPenalties and MinOverlapPercentPlacementPenalties settings in FVWM, and it hit me. What if you could dynamically define which windows are important? I.e. what screen space should have more “Don’t cover this up unless necessary”, how big a frame (i.e. for auto-tiling) should be, and so on?

It is technically possible, if perhaps computationally ‘interesting’, to figure out at the machine level. The windows that spend the most time focused, or most often gain focus, would be prime candidates. If the user ‘uses’ a window more than others, give it a larger chunk of the available space scaled to its idea of how much space it needs, then prefer minimising how much of those windows gets covered over or shrunk to absorb other windows into the same screen space.
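The core of the idea is just a weighting problem. A minimal sketch, in Python, of the “focus time decides importance” notion; the window names and numbers here are made up, and no real window manager exposes exactly this:

```python
# Hypothetical sketch: weight each window's share of screen space by how
# long it has held focus. Even splits when there is no data yet.

def space_shares(focus_seconds):
    """Map window name -> fraction of screen space, by focus time."""
    total = sum(focus_seconds.values())
    if total == 0:
        # No usage data yet: split the space evenly.
        n = len(focus_seconds)
        return {w: 1.0 / n for w in focus_seconds}
    return {w: t / total for w, t in focus_seconds.items()}

# A window focused three times as long gets three times the space.
print(space_shares({"editor": 300, "browser": 100}))
```

A real implementation would also have to cap and decay these weights so one marathon session doesn’t permanently starve every other window, but the principle is that simple.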

It is food for thought!

For (network) testing purposes, I’ve rigged a spare partition on my desktop as a virtual duplicate of my laptop; obviously s/ati/nv/, and Linux is smart enough to take care of the rest. To make the most of it, I also swapped a few things around to the latest packages. For dependency reasons Gnome is installed; likewise KDE for old times’ sake and Xfce for completeness. I decided that since I needed a desktop session to test the ‘common’ web browsers, I may as well take KDE for a spin.

So far, I’ve tried about four or five versions of KDE since 4 went public, all but one of them release versions. Taking a count from the moment the KDM wallpaper is replaced by my blanked Xfce one, my meagre laptop loads Xfce into a usable state by “The count of three”, and has Dropbox and some applets loaded by five seconds. By contrast, on my waaay more powerful desktop, not only did I give up counting at the second mark of the startup splash screen: I dropped my water bottle and had to fish around in the dark to retrieve it from under the table. By then, KDE still had not gotten halfway through its start-up splash screen 8=). I like KDE, I’ve even used some versions of 3.4 on a piddly 500MHz system once upon a time. But KDE 4 is just slow, freaking slow!

However, I must admit that KDE offers a very pleasant and polished visual appearance. It’s like looking at a sleek sports car, only better. Their new desktop metaphor, as it were, is likely a grand improvement over the traditional desktop. Compared to wrapping one’s noodle around Deskmate or living with the UI that has plagued Windows for the last 15, if not 25 years, it is also arguably easier to use. No doubt about it, a first load of KDE is a hell of a lot more straightforward than a straight load of modern (or classic) Windows.

As well done as it all is now, and as much as I remember enjoying KDE(3), if KDE(4) is the way the future desktop will be, methinks I will be continuing to use a keyboard and terminal emulator more often than a mouse, keyboard, and GUI applications!

It’s nice stuff, but hell, if it’s going to be that slow, why even use more than an xterm?

Hmm, I must admit that custom configuring a Linux kernel seems to offer three possibilities:

  • Lean, mean, and sexy kernel build
  • More modules than you can shake a stick at
  • Major headaches

I’m tempted to configure for a balance between the first and second; it is an interesting idea though. If I tuned a kernel build for my very specific system, it would strip out most of the usual bloat. The downside is there are so many configuration options that making the config might take longer than compiling Linux!

Oh freaking vey, what a cycle!

Some time ago, installing KDE rather fouled up the Gnome session on my laptop. That was the first strike against Ubuntu package management. Well, the other day I was adding a few more development packages, and trying to think of what kind of minimalist tiling window manager I would like to try. The only real reason I’ve been using Gnome the past few months is that it’s the default and the system is kind of centred around it o/. In working on a list of what window managers I wanted to test out, I decided that I would like to install dmenu first. So I installed dwm-tools to get it, using synaptic (I find it easier to use the GUI for searching available dpkgs).

Well, sure enough on reboot things were FUBAR. GDM unable to log into anything, XDM bumfucked, and using KDM to launch a Gnome session resulted in a barely functioning one, just like before. KDE however worked perfectly, though I have also come to see KDE 4.4 as the slowest pile of software in the Linux world >_>. That’s the only bad thing I currently have to say against it. Reinstalling GDM, Gnome, and related packages didn’t help matters any. So I bid farewell to Ubuntu once and for all, and I’m not going to say hello to Debian for a while either.

I’ve always used Slackware or Debian-esque systems when I’m stuck using, or desiring to use, a GNU/Linux distribution. People have recommended Arch and Gentoo, and I’ve meant to experiment with Source Mage and Arch for a while. However, I don’t have time to fuck around, and Debian dpkg or Fedora rpm level compatibility is desirable. So I flicked a wild switch and decided to try something a bit more red-headed.

Enter CentOS 5.5! While it is certainly a fine Linux distribution, and its yum tool proved much more pleasurable than manually invoking rpm, there were numerous problems. Most of the packages in CentOS, even after adding RPMForge and EPEL (community supported repositories of newer packages for RHEL), were ancient. The youngest of my development packages was slightly younger than my laptop, and most were just so old that it’s distasteful. That would mean, to get any *real* work done, I would have to forsake yum and install/manage my software manually from source. Joy, why didn’t I just slack off? That however wasn’t a show stopper. It was getting the blobs I rely on to function on top of that, that seriously broke the deal. I gave up trying to get Chrome working. There were also problems getting the default Gnome desktop to work, but I wasn’t planning to use anything heavier than blackbox anyway.

CentOS has earned my respect among Linux distributions, and I like the system a lot. I just can’t rely on it for my personal work station :'(. For regular desktop and laptop users, CentOS is probably a great idea though. I’m not a regular user by any means.

So after that, I started relying on the only thing left I could trust: my own head. Using a mixture of CentOS on my laptop, a USB stick, and NFS mounting the work dir’ on the desktop (faster processor), I set to work building scripts to fetch and build Linux and the usual GNU packages. My own Linux distro. Trying to get things to actually fucking build was a bit of a different story. Remind me to never rely on chroots in Linux.

Since plans C and D popped a cork, I quickly zipped up my work and saved it to the flash drive. Then archived my home directory over SSH. Googled for Slackware’s latest release and searched ye ol’ Wikipedia for related distributions. I know of several but have never used anything besides Slackware itself, except for a very brief test run of KateOS. In a quick grep of distros related to Slackware, one that stood out was Zenwalk.

Plan B, as everyone knows, is make it up as you go along. Or at least, out of my ever present plans A through D, that’s my plan B ;).

So I have set up Zenwalk Core 6.4 (they have several distributions). Unlike the complete Xfce based system of Standard, Zenwalk’s Core distribution has a rather minimal but complete base of packages. All without the headache of selecting what to install in Slackware’s installer lol. Core is a command line install, X isn’t included. That is my kind of system, hehe. There are a few helper tools but for the most part, I prefer to work directly in /etc when possible. Being based on Slackware, of course Zenwalk Core doesn’t feel alien in this department. The Debian/Red Hat based systems tend to be more confusing than need be, whereas the BSD systems usually forgo run levels in favour of traditional Unix simplicity. In Slack country, a happy medium is found.

The main point of interest here is package management: Zenwalk uses a shell script called netpkg to manage things. It really is a crude form of pkg_add/apt-get, but it gets the job done. It’s not perfect and has its quirks; for example, netpkg foo will interactively ask whether you wish to install/reinstall each package matching ‘foo’, along with a yes/no for installing each missing dependency, whereas netpkg install full-foo-pkg-name.txz will install foo, omitting dependencies.

It’s the Slackware compatibility that I like about it though, namely the ability to rip apart RPM packages into Slackware tarballs and hand-sort the dependencies (netpkg can do some dependency work). Installing Dropbox was a cake walk: just rip open the RPM and install it as a vanilla Slackware package.

The problem is the network, sigh. The reason I hate Linux from a user perspective is that YMMV quite a lot between Linux distributions. Never mind that most distributions use the same software. In my case, the problem seems focused totally on my wireless card. I’m also too tired to go into it right now.

But suffice it to say, I am still alive <_<

Laptop, sweet laptop

I never thought I could miss such a filthy old computer so much! Today in the snail mail, the replacement AC adaptor I ordered finally arrived; it only took about two weeks. The original one racked up too much damage right near the plug at the laptop end. The thought of slicing it apart where the damage was and splicing it back together was all well and good… until noticing that end is a frigging coax, which works a tad differently than a simple set of copper wires o/.

After a few days of using my desktop’s rat “Tuned up” for Quake / COD, and two weeks without using my laptop, it even feels a bit alien to be using a touch pad again. On the upside however, because of how much time I spend typing, contrasted to where I spend it: I feel PERFECTLY AT HOME using my LAPTOP KEYBOARD!!!

I’m sorry to say that although I love the Model M style kb, I love my laptop even more; it’s ingrained in my muscle memory lol.

Perhaps, how not to do Open Source Software!

In grepping my feeds, I noticed something interesting: Diaspora has finally hit some recent publicity since my last check in. For those not in the know, Diaspora is meant to be a replacement for Facebook. The idea is good: basically take everything you could bitch about regarding privacy issues on FB, fix it, and roll it out with a distributed system that gives you as much control as possible over your stuff. I agree with Mark Zuckerberg (the FB guy) that Diaspora is a cool idea; really it is, because of the decentralisation.

From what I’ve been reading about their first code drop however, I must say that it does not seem to be off to a very good start. A system that, to my knowledge, touts privacy and security (compared to Facebook) as one of its strong points obviously should not premiere with more leaky holes than a Windows 98 demonstration. As much as I would like to crack a joke at that old relic, I can’t help but think how well Bill Gates took that incident in public. Old farts and fellow history lovers might see the inner humour in that comparison. (Yes, I used ’98.)

Being able to get a lot more eyes on target, and the freedom with which fixes may flow, is one virtue of open source development, especially if you have enough people with a vested “Interest” in the project’s outcome. There are many people who would like to see something like Diaspora succeed, and among them, surely, more than a couple willing to contribute aid towards that end. In a closed source environment, problems like those found in Diaspora would only be findable by playing around with the release, and consequently only fixed by the original developers long after attacks went wild. Likewise, the investors would be a different sort. Yes, even power users do glance at how their software works, let alone crackers. Of those who really are looking closely, most are probably the dregs of the Internet or paid for the job, and either way it would be bad to bank a business on the kindness of others. To my knowledge the only profit in finding exploits is what you can slurp out of saps before it gets patched.

In the first article I checked out, some of the (now fixed) defects highlighted from Diaspora’s code base were just glaringly “WHAT THE FUCK WERE YOU THINKING!?” kinds of problems. At the least, several of them are on my head’s internal list of “No, Nos” to check before wrapping up. It makes me think the masterminds behind implementing the thing were woefully unprepared for the task: web development is no easy task, and it is best if you take an anal approach to security early on, in my honest opinion.

The thing that irks me however, is who should be fixing those kinds of things? Most of what I’ve seen highlighted should have been fixed before the code even left the developer’s workstation, if you go by my coding ethic. That gives my mind a moment to think about student-programmers, but this isn’t a rant; yet. Any way you slice it however, it is not the whole wide world of Open Source’s job to fix everybody else’s code! Before you put your name on it, geeze, make sure it smells like a roll before you get rolled. I don’t mean to say anything against the developers… but this is looking like the start of an epic failure. Sadly.

Fresh USB

While standing in the checkout line in Big Lots today, something caught my eye: $9.50 each for HP branded 4GB USB flash drives. Now I’m a stingy son of a bitch if there ever was one about luxury items (and when you have a network, I call UFDs a luxury), but that was a nice deal.

The last time I bought a UFD, I paid something like six times that much for a little 512MB stick at Circuit City; that’s 1/8th the capacity of what I just got for ten bucks lol. Some things are just too good to pass up :-/. Hopefully unlike my previous one, it won’t end up getting destroyed… or becoming the dog’s new chew toy.

Misc thoughts of a lazy programmer

Since opening a chequing account, I’ve maintained a simple text file in my home directory that takes the format:

YYYY-MM-DD +nn.nn // comment about this credit
YYYY-MM-DD -n.nn // comment about this debit

where the entries track the actual flow of money, not what’s in the chequing account or what’s on my person at any given time; that’s the bank’s job and mine ;p. Basically from this file, I know when I’ve spent or received money, how much, and approximately why. A quick compile of that against my bank balance and petty cash results in an error check: if the sums don’t match, something didn’t make it into my records! Since the file provides an obvious backtrace, it’s trivial to check things against memory/receipts for anything that’s been missed.

Now, being a lazy S.O.B., why should I manually write out the entries in vim and do the math in CL? Well, arguably I could just implement a program to do all this for me :-).
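The error-check half of such a program is maybe a dozen lines. A minimal sketch in Python, assuming the `YYYY-MM-DD +nn.nn // comment` format above; the sample entries are invented for illustration:

```python
# Minimal sketch of the ledger error check: parse each line of the
# "YYYY-MM-DD +nn.nn // comment" file and sum the signed amounts,
# ready to compare against bank balance + petty cash.

import re

LINE = re.compile(r"^(\d{4}-\d{2}-\d{2})\s+([+-]\d+\.\d{2})\s*(?://\s*(.*))?$")

def ledger_total(text):
    """Sum the signed amounts in a ledger; silently skips malformed lines."""
    total = 0.0
    for line in text.splitlines():
        m = LINE.match(line.strip())
        if m:
            total += float(m.group(2))
    return round(total, 2)

sample = """\
2010-09-01 +100.00 // pay cheque
2010-09-03 -9.50   // HP flash drive
"""
print(ledger_total(sample))  # compare this against bank + petty cash
```

A real version would want to flag malformed lines instead of skipping them, and probably count in integer cents to dodge floating point, but as an error check against the bank statement this already does the job.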

Some people utilise spreadsheets, which is more work than it needs to be unless they get rather advanced with them; namely, I’m too lazy to use things like =DATE() and argue with locales when I could use something else. There is also a ton of financial software in the world, but reading the manual tends to take longer than writing the thing yourself, and many are overkill.

My thoughts? Either a small local program or web app would be perfecto. Finding one of the latter should be as easy as using Google or taking 20 minutes to write one. Decisions, decisions :-).

The thing I have against “Canned webapps” that some business runs is the lack of control: it’s impossible to hack the code quite the way you can with something home-brewed.