Well, at long last: my own build tool

In dealing yet again with compiling dependencies under Win32, and yet again having to deal with a yucky mixture of GNU Autotools, CMake, and IDE-specific project files… I've come back to thinking about an idea from several months ago, for a project called `yet another build system`, which is what led me to learn CMake in the first place.

The original (yabs) idea was centred around the concept of XML descriptions of build tools, most notably the Visual C++ and GNU compilers, which would have been mated with a simple `make`-like list of rules, basically on par with the inference rules everyone knows and loves (or hates).

For tmk however, the horizon has expanded and the XML is dropped lol. tmk is meant to be very much a more portable make with interesting tuning options. Conceptually it's the same, but, eh, fixed, you could say. If you don't understand that, just think back to using Makefiles and having to support 20 different build environments! Yeah, that kind of fixing.

The main point is abstracting away the shoehorning measures. Make is perfect for the environment it was created for: building software on UNIX. However, unix has diversified and fragmented sufficiently that the average programmer is a useless stick in the mud at writing portable sh, and supporting other environments like Windows generally requires a separate set of Makefiles, or carting along the (superior) unix environment with you, which is a terrible way of doing things. Fixing that problem is simple; it's been done.

As a novel feature, tmk will aim to complete builds as quickly as possible, by inferring from the rules how to build modules in parallel, which also provides me an excuse to write some code I've been meaning to write for EPI ;). This part is also easy.
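
To make that concrete, here's a rough sketch of the idea in Python (purely illustrative; tmk's actual rule format doesn't exist yet, and the targets here are made up): walk the rule set in waves, dispatching every target whose dependencies are already satisfied.

import concurrent.futures

# hypothetical rule set: target -> (dependencies, build action)
rules = {
    'app':   ({'foo.o', 'bar.o'}, lambda: print('link app')),
    'foo.o': (set(),              lambda: print('cc foo.c')),
    'bar.o': (set(),              lambda: print('cc bar.c')),
}

def build_all(rules):
    done = set()
    with concurrent.futures.ThreadPoolExecutor() as pool:
        while len(done) < len(rules):
            # anything whose dependencies are all built is ready right now
            ready = [t for t, (deps, _) in rules.items()
                     if t not in done and deps <= done]
            if not ready:
                raise RuntimeError('dependency cycle in rules')
            # independent targets get built in parallel
            list(pool.map(lambda t: rules[t][1](), ready))
            done.update(ready)

build_all(rules)

Real make does much the same with -j, of course; the point is that the rules alone carry enough information to parallelise safely.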

The interesting part will be handling dependencies properly. The real royal fuck-baked irk of most 'latter day make' solutions, such as CMake or VCBuild, is that they can't handle dependencies worth a shit: and it makes building software off the original workstation (windows) or operating system group (unix) a bitch and a half.

That’s the personal itch tmk has to fix 😉

Having a bit of time free, I've elected to work on my game's resource subsystem. Overall, the whole game is a lot like Id Tech 3 in architecture, because it's exactly the kind of architecture I want, even if I don't want to put up with the engine that powered Quake III: Arena so excellently.

For the resource loading, what I want is for a configuration file to define what packages should be loaded, kind of like a PATH for zip files. When something needs to be loaded, it'll be found at a path like foo.zip:/what/file.ext, telling the subsystem to locate foo.zip in the game or user data directory, then return a handle to the /what/file.ext stored inside of it. As opposed to the more Id approach, where file naming, e.g. pak0.pk3, pak1.pk3, …, controls the loading order, I want my game to use a config file to handle that. Because personally, I think having to remember which machine-friendly name a resource resides at is a pain in the arse for making maps and mods >_>.
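
To show the lookup I have in mind, here's a quick sketch in Python using the standard zipfile module (the game itself will be C; the search path and file names are made up for the example):

import io
import os
import zipfile

SEARCH_PATH = ['./userdata', './gamedata']  # hypothetical: a PATH for zip files

def open_resource(spec):
    # split 'foo.zip:/what/file.ext' into the package and the member inside it
    package, _, member = spec.partition(':/')
    for directory in SEARCH_PATH:
        candidate = os.path.join(directory, package)
        if os.path.exists(candidate):
            with zipfile.ZipFile(candidate) as pak:
                return io.BytesIO(pak.read(member))
    raise FileNotFoundError(spec)

# hands back a file-like handle to the member stored inside foo.zip
handle = open_resource('foo.zip:/what/file.ext')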

I'm running things very light on dependencies, mostly because compiling stuff on Windows is a bitch (stupid programmers) and many useful items are not easily available on FreeBSD. So things will probably be built on top of zlib, using contrib/minizip as an example of using zlib with the ZIP format.

It's been an easier day than expected: ma wasn't feeling up to working, so instead I ended up in the supermarket >_>. On the upside, financial woes aside, I got to spend more time focused on programming :-D.

Lately I've been working on the computer games I want to build, and finally came to a working title for `StarfighterGame` -> Stargella. With a subtitle of vengeance or revenge, which is a perfect title for allowing sequels and expansion packs to be made ^_^. The concept is much like the classic arcade games: simple mindless action. I remember Galaxian and Galaga most fondly, as games that I played in front of Pizza Hut as a child. The best free "clone" that I've seen is xgalaga, but it leaves a lot to be desired. What I want is the kind of game you sit there and play for hours, and suddenly realise you've been sitting in front of a computer for the last 6 weeks with a long beard and a torn house coat lol.

The story is simple, also like the great classics: you've returned home to the beautiful world of Freyja to discover it is being ravaged by the evil Viekasiekian Empire. Realising that you are the last able-bodied star fighter pilot on Freyja, you dive into battle, pushing the Viekasiekian forces back off into space for the final showdown against Emperor Zurick. Gameplay will obviously be in the overhead shooter style of Galaxian and Centipede, but unlike the classics, I want the ship to have full range of movement with the mouse cursor, instead of being constrained to, say, left and right, a la Space Invaders.

Calling the planet Freyja is meant to encourage the concept that the planet is a beautiful paradise. Likewise, the lead baddy is named in honour of the graveyard scene from Shakespeare's Hamlet: alas, poor Yorick, I knew him well! The intention here being to evoke thoughts of death and such. After all, the Viekasiekians are meant to be evil ;). Geeze, try to say that three times fast without twisting your tongue in a loop!

Portability concerns have caused me to largely 'drop' python from the mixture, along with all high level solutions for general rendering and game development tasks. FreeBSD is a horrible platform for developing a video game on, IMHO, unless you like the low level stuff. So, I'm using a common denominator of fairly ANSI C / ISO C89 compliant code, with SDL as the principal backend for graphics and input handling.

It's also kind of hilarious: trying to maintain a code base that supports both the GNU and Microsoft Visual C++ compilers, even using a *subset* of common C++, is actually painful. However, using a fairly common dialect of standard'ish C across GCC and Visual C++ is quite a bit less painful. This is principally because MSVC's concept of C programming is more or less 20 years behind the rest of the world 8=).

And if anyone else dares to fucking interrupt me while writing this post, you're going to be flung out a damn fucking window!

One cheerful moment in a dreary day

The world abounds in aphorisms that convey wisdom to the young, although that advice is usually ignored. Many aphorisms are by unknown authors: “A stitch in time saves nine” (although anyone who has repaired a sail knows that one stitch can actually save 9,000). “People who live in glass houses shouldn’t throw stones.” And many witty people have contributed their own, like this gem from Mae West: “Lead me not into temptation; I can find the way myself.”

source

Old regexese mastery revisited

For keeping my mind off things, and passing time while more up to date ports compile, I've spent some time working on neo (a replacement for my private updater.sh; itself a replacement for portmaster/portupgrade). When I had last left off, I was trying to integrate code lifted from NPM's pkg_info wrapper, used for splitting canonical package names into the package's name and version number, when I noticed comically that the old kludge didn't quite hold water. Kind of ironic, considering it was one of the most thoroughly tested snippets of code I have ever written lol. The principal problem was making it deal with names like libsigc++, pcre++, and libxml++, which contain regex meta characters (the +'s), and dealing with the () escapes breaking the algorithm.

So I just sat down and rewrote it, *properly* this time around, and it works like a blasted charm, in half the amount of code and a fraction of the execution time to boot. The only downside is that between development and testing, I've spent several hours on it, and I've got to get up for work early tomorrow :-(. But hey, it's not like I could have gotten a decent night's sleep anyway… and I would rather have something like this catch my focus than leave my train of thought to its usual ends.
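
The gist of the fix, sketched here in Python (the helper name and version numbers are mine, made up for illustration, not NPM's actual code): a canonical package name is just name-version with the version after the last hyphen, so splitting from the right sidesteps the regex metacharacter problem entirely.

def split_pkgname(canonical):
    # 'libsigc++-2.0.18' -> ('libsigc++', '2.0.18'); no regex, nothing to escape
    name, sep, version = canonical.rpartition('-')
    if not sep or not version[:1].isdigit():  # versions start with a digit
        raise ValueError('not a canonical package name: %r' % canonical)
    return name, version

for pkg in ('libsigc++-2.0.18', 'pcre++-0.9.5', 'libxml++-2.26.1'):
    print(split_pkgname(pkg))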

neo is basically meant to replace updater.sh, but at the moment I'm focused on replacing portversion, to make it worth using. Basically, I gave up on portupgrade and portmaster, and wrote a shell script that is more to my taste (updater.sh) some months ago. Using it is a simple thing:

# portversion -o | grep '<' | awk '{ print $1 }' > /tmp/pu && vi /tmp/pu
# /home/Terry/updater.sh /tmp/pu

leaving me to manually select and sort the list before starting the upgrade. Now I could easily insert filters to handle that kind of stuff in a highly flexible way, but if we're gonna do that… we may as well ditch the portversion dependency, and that means ditching /bin/sh :'(. That makes the current dilemma converting the wacko-zacko formats of version numbers into something comparable, in order to tell whether a port is up to date or not. Technically a straight compare could be done, since it obviously isn't up to date unless it matches the version saved in the INDEX, but that precludes handling common edge cases.
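
To illustrate the sort of normalisation I mean (a sketch only; it ignores PORTEPOCH commas and plenty of pkg_version(1)'s other rules): chop the version string into numeric and alphabetic chunks, so that '1.10' compares greater than '1.9' instead of less, as a plain string compare would have it.

import re

def version_key(v):
    # '2.0.18_2' -> [(2, ''), (0, ''), (18, ''), (2, '')]; numbers compare
    # numerically, stray letters lexically -- a heuristic, not pkg_version(1)
    return [(int(num) if num else -1, alpha)
            for num, alpha in re.findall(r'(\d+)|([A-Za-z]+)', v)]

assert version_key('1.10') > version_key('1.9')   # numeric, not ASCII, ordering
assert version_key('2.0_1') > version_key('2.0')  # a PORTREVISION bump counts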

EPI rolls on

It's been a bit of a busy, hectic, if slowly progressing day :-S. Most of my time has been a juggle between being driven crackers and trying to actually get stuff done. I've made about 9 commits to EPI and created more wiki content, so at least it's been a viable use of time.

Since I'm more familiar with git than any of my colleagues, I've been focusing on the git side of wiki content for now. Source code management / version control technology is pretty simple to use, and although I can't speak for the early days, modern git (1.5+) is downright easy lol. Well, it sure makes me happier than Subversion ever did >_>. Most of my experience with git has been as sole developer and, shall we say, without having to worry about patches, so I'm more intent on having this kind of stuff documented in our wiki than I normally would be about such a matter.

Also did experiments in writing the program documentation in DocBook XML. So far so good: the manual page output from docbook-xsl is pretty damn good, and the HTML will do perfectly enough I think, when mated to a little CSS, hehe. I'm not sure what we will be using for writing our documentation beyond the Wikiverse, but the best options are DocBook, Plain Old Documentation, and GNU Texinfo. Personally I prefer POD, but DocBook and Texinfo are more appropriate. I shared the DocBook manual I did this afternoon, along with links to the documentation for the GNU Compiler Collection, to serve as examples of what the two systems look like. If we need any custom XSL files, I can probably whip them up; I already have rudimentary stylesheets to convert a book to forum and blog posts ;).
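
For a taste of what the DocBook side looks like, here's a minimal refentry skeleton of the sort docbook-xsl will happily turn into a manual page (the names and text are placeholders, not our real docs):

<refentry id="epi">
  <refmeta>
    <refentrytitle>epi</refentrytitle>
    <manvolnum>1</manvolnum>
  </refmeta>
  <refnamediv>
    <refname>epi</refname>
    <refpurpose>a placeholder one-line summary</refpurpose>
  </refnamediv>
  <refsect1>
    <title>Description</title>
    <para>The prose of the manual page goes here.</para>
  </refsect1>
</refentry>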

For the most part, I've been in a rather melancholy mood of late, so being able to focus on EPI has helped me a lot today. I'm usually happier when I'm able to 'geek out' than when I'm assaulted and bored…

A little $HOME, NPM, and EPI fun

Been cleaning up my home directory a bit, particularly ~/Projects and the dumping ground in ~/2besorted lol. I've managed to find a lot of old notes, and have even imported a couple saved items into del.icio.us. While I've mostly dropped all my bookmarks, there are still a few files with old bookmarks in them. I merged them, using the old `cat files | sort | uniq > newfile` method ;). Sometime I've gotta sift through them and see what's worth keeping. Since ma.gnolia's great lossage, I've still been maintaining the habit of using Google and my brain's limited RAM as a substitute for bookmarking >_>.

I've also taken some time to sort out the change over from Subversion to Git on NPM's SourceForge page. Last night, I started work on converting my updater.sh to Python… figured to dub it 'neo' and incorporate it as part of NPM. Hey, I've always wanted to remove portupgrade from the picture… lol. The neo script is basically meant to be comparable to portmaster or portupgrade, and whatever century time brings me back to Neo Ports Manager, it will likely serve as the backend. However, I must admit, unlike NPM, this part is somewhat of a selfish agenda at heart: it will be able to do more than my updater.sh can without help.

Finally found the time to add a few pages to the wiki being set up with the Encapsulated Package Installer project. The stuff there is still mostly bare, because I've been focused elsewhere the past couple of days. Mostly the content added was centred around development tasks; I'm going to have to settle down and plan out what changes need to be made. All in all, things have been moving forward. When it comes to my personal projects, EPI has the lead seat, and the rest are enqueued, in terms of programming.

That being said, of course I have my spells of working on other things when shorter periods of time are free: the odds and ends, hehe.

Oh how I love git, let me count the ways!

Initially I kept separate repositories for each portion, most notably the EPI core and build system. Since Trac was selected for part of our web stack for getting stuff done, and it can be a tad pissy about multiple repositories, I've opted to create a "merged" repository from the others. Essentially, Trac will require multiple Trac environments or a singular repository; while we take the time to decide which to use, I just whip up a solution out of my hat like a good little geek &(^_^)&.

The trees were located in dixie:~/Projects/EPI/repo/ and still are. After backing up the repositories, I created a new 'work' tree to become the home of this merger, threw them together, and did some clean up. First I tried using git filter-branch and git format-patch together with a few other mungies to get the history retained how I wanted it; then I decided to screw it and just make branches reflect the history, which turned out even better than what I wanted.

I then used git format-patch to create patch sets, placing them in temporary directories. Rather than change the patch sets to reflect the merge (a good task for perl scripting), I decided to rely on git mv for something more foolproof than hacking patch files by hashed-out automata.

Creating a new 'work' repository, I made an initial commit with a stub file, then created suitable branches for each of my original repos, a task easily automated in sh or Perl for people with lots of branches (see the sketch below). A little bit of git checkout and git am then slurped up the patch sets, bringing each repository (and its associated branches) under one roof.
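
The automation could look something like this, a throwaway sketch with made-up repo paths (Python here, though sh or Perl would do just as well): export each old repository's history as a patch set, then replay it onto its own branch of the new work tree.

import subprocess

# hypothetical: branch name -> path of the old repository
repos = {'core': '../repo/core', 'buildsys': '../repo/buildsys'}

for branch, path in repos.items():
    # export the whole history of the old repo as numbered patches
    patches = subprocess.run(
        ['git', 'format-patch', '--root', 'master', '-o', '/tmp/patches-' + branch],
        cwd=path, check=True, capture_output=True, text=True,
    ).stdout.split()
    # run from inside the new 'work' repository: branch off the stub commit
    subprocess.run(['git', 'checkout', '-b', branch, 'master'], check=True)
    subprocess.run(['git', 'am'] + patches, check=True)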

Creating the new merged ‘master’ was a simple octopus merge.

$ git checkout master
$ git merge repo1 repo2 ...
$ git mv ...
$ git commit -a

Job done, good night!

Note also, since I wanted the trees merged side by side, conflicts were not even present, hehe.

Full throttle

So far so good; I managed to burst through my weekend todos hehe. Completed my work for [SAS], got time to play a couple games, and did the setup for ticket and documentation management needs on a personal project. Plus I've been aggressively moving ahead with the [SAS] Dept. of Agriculture map for SWAT 4.

All this while upgrading my laptop >_>.

Remaining to do with the DoA is setting up access from the outside spawn on level 1 to the window breaching point up on level 3. I've completed the relighting and half the general cleanup of the building, so the only big task left is redoing the AI. I plan on setting up an interesting little "scenario", hehe. If I could, pushable barricades would be a nice touch, but I've yet to figure that out in SwatEd.

Most of my time has been spent on the Encapsulated Package Installer (EPI) project. We now have a forum, issue tracker, and wiki system going. Things are not moving as quickly as I would like with EPI, but the past two weeks have brought rapid advances on infrastructural issues. When our new systems are more established, I'll likely make reference to them here.

Overall, the highlight of my weekend has been (at long last) getting my invitation request to Google Wave accepted. So far it seems to be coming along well, but I have almost no one to wave with at the moment lol. The ability to invite 8 others came along with it, most of which I've mentally reserved for a few close friends, with a couple kept for good uses. The main downside of Wave is simply that e-mail and traditional IM systems have been around a really fraking long time, so it's an uphill battle. One that I expect the world is not quite ready for yet, although it is very much a technology in the direction of our Internet future.

I've even found the time to migrate most of my gaming related pictures on to the WWW! The old home directory is getting thinner ^_^. Another thing I've gotten done is transferring Live Journal entries from November 2008 to this blog. I really need to pick up the pace on that. As much bother as it is, one thing I really do like about this migration procedure is that it gives me the ability to organise my old entries using labels; it walks me down memory lane, and helps me to find 'useful' posts that were forgotten.

Life is busy; the serious issue is advancing forward quickly enough for my tastes… rather than being bogged down and omni-tasked to death. My family is good at doing that.

A little fun with git: publicly exporting your local repositories

One of the great things about git is its distributed nature; in my humble opinion, being able to tell your partners to pull your latest code is a useful stopgap for code review (without better tools… for now, lol), and beats having to e-mail the flubber as a tarball.

In my case, I maintain my working tree on Dixie, usually stored under ~/Projects/ somewhere. To prevent freak data loss, I also push things out to bare repositories stored on Vectra, under /srv/git. Those repos on Vectra are my "centrals", which will usually get pushed out somewhere else (e.g. SourceForge) if the project's public. The fact that my home directory on Dixie is also backed up is a bonus hehe.

In order to set up a suitable means for people to clone, fetch, and pull from my git repositories, I edited my router's configuration and set up a NAT (Network Address Translation) rule to forward a suitable port (git's default being 9418) to Vectra. In Vectra's pf rulesets, I unblocked said port.

For write access, I use SSH and public key authentication to manage the repositories: and no one is permitted SSH access to any of my machines, unless they manage to break into my home wireless (or penetrate and suitably spoof my workstation), discover my username and hostname mappings, and brute force their way through the key pair before the internal firewalls tell them to F-off for good ;). In which case, good job monsieur or mademoiselle psychic!

Public read-only access may be set up with the humble git-daemon. Read-only access with controls, well, that is a task for something else ^_^.

The git daemon can be a fairly strict prickly pear about what it does export, so I feel reasonably comfortable with it. I created a simple whitelist file, called /srv/git/exports, that describes what repositories may be exported: the format is one line per repository giving the path to export publicly, with blank lines and lines starting with a # comment ignored.
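
For instance, the exports file might read something like this (the repository names are invented for the example):

# /srv/git/exports: repositories that git-daemon may serve read-only
/srv/git/epi.git
/srv/git/stargella.git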

I wrote a simple /etc/rc.git-daemon script that I can call from /etc/rc.local when OpenBSD starts, like so:

Terry@vectra$ cat /etc/rc.git-daemon
#!/bin/sh
# start/stop the git daemon; called from /etc/rc.local at boot

if [ "$1" = stop ]; then
	logger -t GIT stopping git daemon
	kill -9 $(cat /srv/git/git-daemon.pid) && rm /srv/git/git-daemon.pid && logger -t GIT git daemon stopped
else
	logger -t GIT starting git daemon
	# feed the whitelisted repos to git daemon, skipping blanks and # comments
	grep -E -v '^$|^#' /srv/git/exports | xargs git daemon --user=nobody --group=git --pid-file=/srv/git/git-daemon.pid --verbose --detach --export-all --syslog --base-path=/srv/git && logger -t GIT git daemon started

	echo -n ' git-daemon'
fi

After this is executed, it’s possible to:

$ git clone git://my.ip.addr.here/exported/repo/relative/to/base-path

As an extra bonus, since /srv/git uses my 'git' group for permissions but my umask by default tells everyone else to screw off, I have to manually set permissions on repositories I wish to export before anyone can access them through the git-daemon.

Ok, so I’m nuts.