Old regexese mastery revisited

To keep my mind off things, and to pass time while more up to date ports compile, I’ve spent some time working on neo (a replacement for my private updater.sh, itself a replacement for portmaster/portupgrade). When I had last left off, I was trying to integrate code lifted from NPM’s pkg_info wrapper, used for splitting canonical package names into the package’s name and version numbers, when I noticed, comically, that the old kludge didn’t quite hold water. Kind of ironic, considering it was one of the most thoroughly tested snippets of code I have ever written lol. The principal problem was making it deal with names like libsigc++, pcre++, and libxml++, which contain regex metacharacters (the +’s), and dealing with the parenthesis escapes that broke the algorithm.

So I just sat down and rewrote it, *properly* this time around, and it works like a blasted charm, in half the amount of code and a fraction of the execution time to boot. The only downside is that between development and testing, I’ve spent several hours on it, and I’ve got to get up for work early tomorrow :-(. But hey, it’s not like I could have gotten a decent night’s sleep anyway… and I would rather have something like this catch my focus than leave my train of thought to its usual ends.
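For the curious, the heart of the rewrite is simply not leaning on a regex for the split at all. Here’s a minimal sketch of the idea in Python (the function name and sample data are mine, not NPM’s actual code): the version in a canonical package name always follows the last hyphen, so splitting on that sidesteps the metacharacter mess entirely.

def split_pkgname(canonical):
    # The version follows the last hyphen, so names like libsigc++ need
    # no escaping and no regex at all.
    name, _, version = canonical.rpartition('-')
    return name, version

for pkg in ('libsigc++-2.2.4', 'pcre++-0.9.5', 'libxml++-2.26.1'):
    print(split_pkgname(pkg))
# ('libsigc++', '2.2.4'), ('pcre++', '0.9.5'), ('libxml++', '2.26.1')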

neo is basically meant to replace updater.sh, but at the moment I’m focused on replacing portversion, to make it worth using. Basically, I gave up on portupgrade and portmaster and wrote a shell script more to my taste (updater.sh) some months ago. Using it is a simple thing:

# portversion -o | grep '<' | awk '{ print $1 }' > /tmp/pu && vi /tmp/pu
# /home/Terry/updater.sh /tmp/pu

leaving me to manually select and sort the list before starting the upgrade. Now, I could easily insert filters to handle that kind of stuff in a highly flexible way, but if we’re gonna do that… we may as well ditch the portversion dependency, and that means ditching /bin/sh :'(. Which makes the current dilemma converting the wacko zacko formats of version numbers into something comparable, in order to tell whether a port is up to date or not. Technically, a straight compare could be done, since it obviously isn’t up to date unless it matches the version saved in the INDEX, but that precludes handling common edge cases.
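Roughly what I have in mind for the comparison is something like the following Python sketch; it’s only an illustration under my own assumptions, and the real thing will need to handle the _N PORTREVISION and ,N PORTEPOCH suffixes properly rather than as plain text:

import re

def version_key(v):
    # Chop the string into numeric and alphabetic runs, tagging each chunk
    # so numbers compare numerically and never collide with letters.
    return [(0, int(c)) if c.isdigit() else (1, c)
            for c in re.findall(r'\d+|[A-Za-z]+', v)]

def is_outdated(installed, index_version):
    return version_key(installed) < version_key(index_version)

print(is_outdated('2.2.4', '2.2.10'))   # True: 10 beats 4 numerically
print(is_outdated('1.0.1', '1.0.1'))    # False: already up to date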

That was a freaky moment… I was sitting here writing a post to a mailing list over ggrps when the light/tv started to fade in and out. Did a quick ^A^C in the hopes that I might get the text copied to qlipboard’s file store and flushed to disk before the system could fail… no such luck; an hour’s worth of typing gone, and I’m not rewriting it.

The strange thing is, it wasn’t a normal power outage, but more of a fluctuation. Part of the building went dead, but part remained on extremely low power, and I’m talking like a lamp acting like it’s been given a 10 watt bulb lol. The best way I can describe it is as if the available power was turned down waaaayyy low, and anything not able to compete for enough power started to shut off.

What is this, Republic of Turkey or something?

EPI rolls on

It’s been a bit of a busy, hectic, if slowly progressing day :-S. Most of my time has been a juggle between being driven crackers and trying to actually get stuff done. I’ve made about 9 commits to EPI and created more wiki content, so at least it’s been a viable use of time.

Since I’m more familiar with git than any of my colleagues, I’ve been focusing on the git side of wiki content for now. Source code management / version control technology is pretty simple to use, and although I can’t speak for the early days, modern git (1.5+) is downright easy lol. Well, it sure makes me happier than Subversion ever did >_>. Most of my experience with git has been as sole developer and, shall we say, without having to worry about patches, so I’m more intent on having this kind of stuff documented in our wiki than I normally would be about such a matter.

Also did some experiments in writing the program documentation in DocBook XML. So far so good; the manual page output from docbook-xsl is pretty damn good, and the HTML will do perfectly well I think, when mated to a little CSS, hehe. I’m not sure what we will be using for writing our documentation beyond the Wikiverse, but the best options are DocBook, Plain Old Documentation, and GNU Texinfo. Personally I prefer POD, but DocBook and Texinfo are more appropriate. I shared the DocBook manual I did this afternoon, along with links to the documentation for the GNU Compiler Collection, to serve as examples of what the two systems look like. If we need any custom XSL files, I can probably whip them up; I already have rudimentary stylesheets to convert a book to forum and blog posts ;).

For the most part, I’ve been in a rather melancholy mood of late, so being able to focus on EPI has helped me a lot today. I’m usually happier when I’m able to ‘geek out’ than when I’m assaulted and bored…

A little $HOME, NPM, and EPI fun

Been cleaning up my home directory a bit, particularly ~/Projects and the dumping ground in ~/2besorted lol. I’ve managed to find a lot of old notes, and have even imported a couple saved items into del.icio.us. While I’ve mostly dropped all my bookmarks, there are still a few files with old bookmarks in them. I merged them using the old `cat files | sort | uniq > newfile` method ;). Sometime I’ve gotta sift through them and see what’s worth keeping. Since ma.gnolia’s great lossage, I’ve still been maintaining the habit of using Google and my brain’s limited RAM as a substitute for bookmarking >_>.

I’ve also taken some time to sort out the change over from Subversion to Git on NPM’s SourceForge page. Last night, I started work on converting my updater.sh to Python… figured I’d dub it ‘neo’ and incorporate it as part of NPM. Hey, I’ve always wanted to remove portupgrade from the picture… lol. The neo script is basically meant to be comparable to portmaster or portupgrade, and whatever century time brings me back to Neo Ports Manager, it will likely serve as the backend. However, I must admit, unlike NPM, this part is somewhat of a selfish agenda at heart: it will be able to do more than my updater.sh can without help.

Finally found the time to add a few pages to the wiki being set up for the Encapsulated Package Installer project. The stuff there is still mostly bare, because I’ve been focused elsewhere the past couple of days. Mostly the content added was centred around development tasks; going to have to settle down and plan out what changes need to be made. All in all, things have been moving forward. When it comes to my personal projects, EPI has the lead seat, and the rest are enqueued, in terms of programming.

That being said, of course I have my strokes of working on other things when shorter periods of time are free: the odds and ends, hehe.

This Journal Moved To Blogger

Well, I’ve still got a year’s worth of posts to transfer (groan!), but since I’ve been updating my journal at Blogger consistently since the last post here on LJ, half the people that follow my journal periodically have probably updated themselves… lol.

This is the official transferring post!

I have moved to http://spidey01.blogspot.com/ and will be updating my journal there, leaving this LiveJournal as a matter of posterity. LJ has a policy of leaving inactive accounts be, so I expect this page will survive longer than most references to it. All future updates are going to my page on Blogger. If anyone actually tracks me through bookmarks or Atom/RSS feeds, it’s time to update.

Few people, if anyone, read my blog, so it’s not much trouble, beyond updating my forum signatures lol.

Ahh, a good day…

Spent most of the work time thinking about today’s tasks. I really did not expect the new continuation training sessions to go over very well; just an inkling. Almost had a full server, but lost a few due to connectivity issues and a ~20 minute delay in trying to get as many bodies as possible. But hey, if we had started on time, there would only have been two participants to scare off >_>.

Things went pretty well; I tried to keep the live fire exercises fairly easy since it was the first session. Ezbassr and company proved to be more than up to it, much to my relief lol. The purpose of continuation training is to give you a rough and tumble environment to practice your skills. It’s not a piece of cake; you’ve gotta be sharp, or you’ll fall flat on your face. Ah, the joys of [SAS] training :-D. We train harder so we can fight Eazy ;).

Highway 20 ride

I ride east every other Friday
But if I had it my way
A day would not be wasted on this drive
And I want so bad to hold you
Son, there’s things I haven’t told you
Your mom and me couldn’t get along

So I drive and I think about my life
And wonder why that I slowly die inside
Every time I turn that truck around
Right at the Georgia line
And I count the days
And the miles back home to you
On that Highway 20 ride

A day might come you’ll realize
That if you see through my eyes
There was no other way to work it out
And a part of you might hate me
But son, please don’t mistake me
For a man that didn’t care at all

And I drive and I think about my life
And wonder why that I slowly die inside
Every time I turn that truck around
Right at the Georgia line
And I count the days
And the miles back home to you
On that Highway 20 ride

So when you drive
And the years go flying by
I hope you smile
If I ever cross your mind
It was the pleasure of my life
And I cherished every time
And my whole world
It begins and ends with you
On that Highway 20 ride….

— Highway 20 Ride, Zac Brown Band

Oh how I love git, let me count the ways!

Initially I kept separate repositories for each portion, most notably the EPI core and build system. Since Trac was selected as part of our web stack for getting stuff done, and it can be a tad pissy about multiple repositories, I’ve opted to create a “merged” repository from the others. Essentially, Trac will require either multiple Trac environments or a single repository; while we take the time to decide which to use, I just whipped up a solution out of my hat like a good little geek &(^_^)&.

The trees were located in dixie:~/Projects/EPI/repo/ and still are. After backing up the repositories, I created a new ‘work’ tree to become the home of this merger, threw them together, and did some clean up. First I tried using git filter-branch and git format-patch together with a few other mungies to get the history retained how I wanted it; then I decided to screw it and just make branches reflect the history, which worked out even better than what I wanted.

I then used git format-patch to create patch sets, placing them in temporary directories. Rather than change the patch sets to reflect the merge (a good task for Perl scripting), I decided to rely on git mv for something more foolproof than hacking patch files by hashed out automata.

Creating a new ‘work’ repository, I made an initial commit with a stub file, then created a suitable branch for each of my original repos, a task easily automated in sh or Perl for people with lots of branches. A little bit of git checkout and git am then slurped up the patch sets, bringing each repository (and its associated branches) under one roof.

Creating the new merged ‘master’ was a simple octopus merge.

$ git checkout master
$ git merge repo1 repo2 ...
$ git mv ...
$ git commit -a

Job done, good night!

Note also, since I wanted the trees merged, conflicts were not even present, hehe.

Spent most of the day working on the [SAS] Dept. of Agriculture map… been in a terrible mood all day. That’s just the way things have been rolling here of late :-(.

Sorted out the details at the outside spawn point and adjusted the fire escapes accordingly, but had to remove both the window frames and the glass in order to move through it. After shooting out the panes of glass, there’s too much stuff left, and it blocks you from climbing through lol.

Got a brain fart, and added extra rooms/doors connecting that spawn point to the basement garage; just need to sort out the nodes properly before that is finished.

Began work on redoing the central office upstairs, only to find out that I can’t track down the same brick wallpaper used all over the blasted map, so I’ve had to substitute a more fierce kind of brick. Gonna sort out that office space and rebuild the bombed out bathrooms below; then the structural changes will mostly be complete.

I have to do the AI related work and tidy up the marketing office area, but otherwise the map is basically done; it’ll be ready for a little testing soon, hehe.

Full throttle

So far so good; I managed to burst through my weekend to-dos hehe. Completed my work for [SAS], got time to play a couple games, and did the setup for ticket and documentation management needs on a personal project. Plus I’ve been aggressively moving ahead with the [SAS] Dept. of Agriculture map for SWAT 4.

All this while upgrading my laptop >_>.

Remaining to do with the DoA is setting up the access from the outside spawn on level 1 to the window breaching point up on level 3. I’ve completed the relighting and half the general cleanup of the building, so the only big task left is redoing the AI. I plan on setting up an interesting little “Scenario”, hehe. If I could, pushable barricades would be a nice touch, but I’ve yet to figure that out in SwatEd.

Most of my time has been spent on the Encapsulated Package Installer (EPI) project. We now have a forum, issue tracker, and wiki system going. Things are not moving as quickly as I would like with EPI, but the past two weeks have advanced rapidly on infrastructural issues. When our new systems are more established, I’ll likely make references to it here.

Overall, the highlight of my weekend has been (at long last) getting my invitation request to Google Wave accepted. So far it seems to be coming along well, but I have almost no one to wave with at the moment lol. The ability to invite 8 others came along with it; most of those I’ve mentally reserved for a few close friends, and a couple for good uses. The main downside of Wave is simply that e-mail and traditional IM systems have been around a really fraking long time, so it’s an uphill battle. One that I expect the world is not quite ready for yet, although it is very much a technology in the direction of our Internet future.

I’ve even found the time to migrate most of my gaming related pictures onto the WWW! The old home directory is getting thinner ^_^. Another thing I’ve gotten done is transferring Live Journal entries from November 2008 to this blog. I really need to pick up the pace on that. As much bother as it is, one thing I really do like about this migration procedure is that it gives me the ability to organise my old entries using labels; it walks me down memory lane, and helps me to find ‘useful’ posts that were forgotten.

Life is busy; the serious issue is advancing forward quickly enough for my tastes… rather than being bogged down and omni-tasked to death. My family is good at doing that.