Lately things have just been insane; I’m practically busy up to my eyeballs. There’s always loads to get done, whether it’s something pushed onto my stack or something I would rather like to get done. The past week represents a new level of heavy, multi-pronged focus on other matters, but that’s a truckload in and of itself.

I wish that someday it would actually be possible to get some rest :-/ For some reason the concept of time to rest brings forth a vision of a pine box… lol

A glance …

One paramount factor is that my mind constantly processes and evaluates multiple “atoms” of data, for lack of a better word: near-simultaneously crunching away at relevant atoms, both processing new ones and refactoring existing ones in light of continued thinking and supplementary atoms.

I don’t know how to explain it without phrasing it much like the way a computer copes with multi-processing; doing better would take a significantly greater understanding of cognitive science than I possess, and I’m a programmer at heart, not a neurologist. If I knew someone with that kind of knowledge, I would probably ask whether the brain’s ability to handle things in parallel is built into our physiology, or is just a learned ‘trick’ like multitasking on a uniprocessor system. (Ok, everyone’s asleep while my geeky brain drifts off, hehe.)

I refer to these titbits of data as “atoms” because it is the most fitting word. It is not to be confused with the meaning of atoms as understood in chemistry or physics, so much as that cutting chunks of data into still smaller scalar-esque elements would serve little purpose. The word stimuli might be more appropriate, but hey, I’m not an English teacher.

Input sources for these so-called atoms of data can be virtually anything, ranging from what’s under one’s nose to past experiences or established references. In thinking, my mind processes enormous amounts of data, proportionate of course to the amount of data I may gather! Every atom is constantly evaluated, weighted in the context of its value, and studied from a probabilistic perspective.

Over the course of my life, I’ve learned to be quite adept at gathering information on any topic or subject I choose to pursue. More than a few times, I’ve astounded people by quickly gathering an input set and coming to a conclusion based on that data, in fairly short order compared to most of my peers. The ability to search for information should never be underrated, and controlling access to information can be crucial to manipulating an enemy’s reaction, by denying it or permitting its dissemination. That being said, I believe strongly in the freedom of information.

Growing up next to a set of mouldy encyclopaedias and a dusty double volume of the Funk & Wagnalls dictionary, it wasn’t very long after I learned how to read (well) that I started learning how to gather and process resources of information efficiently: libraries, technical manuals, even consulting more learned individuals than myself. Perhaps I should also be eternally grateful to Mage, for introducing me to a search engine that doesn’t suck! After gaining a measure of internet access around 1996-1997, I had found little use for the web beyond answering the quest for intel on any given topic of interest. That being said, I never came to make positive use of the Internet until closer to 2000.

Once I have information, I process it thoroughly, both in context and, in many instances, off the wall as well. As I said previously, everything is weighted (and mentally tagged), helping me gauge the importance of any given atom towards arriving at a workable conclusion. Based on what I know, I also apply ample consideration to the likelihood of any meaning that may be possible. What may be possible, the so-called sample space, is something often run together in my head and computed there as well, rather than worked out straight with numbers. Shall we say, my mind operates on a level where the most relevant atoms are not necessarily easy to express as numbers, yet may readily be thought of as if they were. Earlier this week, I told a friend that between 0 and 1, my grasp of the possibilities for most things I think about often goes beyond the number of digits an average person’s calculator can express to the right of the decimal point. The depth of my thoughts can often be compared to a typical person spending twenty years cracking at it… lol. Ok, so I think a freaking lot.

In building away at what I know, in order to put it to some applicable use, my reasoning is fairly varied, an art form if you will. I combine what may be inferred with what one may wager, and fairly carefully gauge the potential follow-ons, using both what I know and an increasing understanding of what I don’t know to arrive at the end of the line. One of my favourite jokes goes something like this:

An astronomer, a physicist and a mathematician are on a train in Scotland. The astronomer looks out of the window, sees a black sheep standing in a field, and remarks, “How odd. Scottish sheep are black.” “No, no, no!” says the physicist. “Only some Scottish sheep are black.” The mathematician rolls his eyes at his companions’ muddled thinking and says, “In Scotland, there is at least one sheep, at least one side of which looks black.”

 You must always check your facts ^_^.

Most things I think about in depth do not require very precise processing (someone I know might call it a lack of rigour), only a disciplined and orderly analysis of atoms. Experience as a programmer, as well as much time spent labouring over questions of “How does something work?”, has taught me much of that. Organised thinking is a very valuable asset to any reasonable individual.

It can never be said enough, however, that the sources of atoms must also be evaluated as much as the atoms themselves (fellow coders might call this a form of mental taint checking). Sources are important: check and discriminate between them! One can learn much from a study of historiography; if the words primary source and secondary source mean nothing to you, you should probably repeat part of school… and try not to sleep through it this time 😛 (that, or have your old history teacher fired). Many people who berate resources such as Wikipedia often need to learn the value of checking multiple sources, and of strongly taint checking the atoms that go through their brains.

When dealing with matters of intelligence, it is even more important to consider the validity of one’s sources, and their meaning. Perhaps the best examples of this would involve a game of poker or a peek at a handful of agencies, but hey, go learn it yourself. I am very objective about the information I process, as well as about that which I promulgate. Some of the more shadowy figures who follow my journal every now and then will no doubt have comprehended that, if they themselves possess any understanding of intelligence analysis.

In the course of my life, I’ve found digging into solid sources, as well as concise overviews of any subject matter, to be a worthwhile pair. I have never allowed anyone or anything to influence my opinion so much as I search for and study the factual data, and what may be proven through it; and despite the marbles that sometimes rattle to and fro, my mind does not settle lightly without good reason.

Earlier I noted that all of this occurs fairly simultaneously. At any given time, my brain normally follows about 2 1/2 to 3 trains of thought, and regularly the impact of mental operations upon one atom will affect dozens of other atoms, not always related ones either, for that matter. (I love phrasing things like that :-P.) Being able to follow trains of logic and reason while gathering more data, and to do so concurrently, is inherently a valuable skill to learn. Perhaps I owe this one to having so often (ab)used my computer’s capabilities at multi-tasking, and thus my mind had to adapt to take advantage of it.

Practical experience in a matter, as well as taking into account evaluations from others, also plays a role in things. One interesting example: SAS has placed me so many times into planning and leading missions that I often have the perspicacity to tell when something sounds plausible coming from a younger Element Leader but is really a load of malarkey. Such usually becomes obvious under scrutiny. One’s experiences, however, cannot override all other considerations; after all, just because you’ve never seen a winged horse in the sky does not rule out the possibility that such a creature could [have] exist[ed] somewhere in the universe.

Pour it all into a mythical brew, and you have an interesting if verbose summary, expressing parallel application of deductive and inductive reasoning, contextual evaluation, systematic thinking, respect for probability, taint checking, experience, and more than a few techniques for analysis that I care not to mention publicly. Growing tired, perhaps it might just be more succinct to explain my mental processes as a form of deep magic that describes my mind’s fundamental functioning. Some other people still work on the spaghetti theory.

Oh wells :-/

Angels & Demons

Just got through watching Angels & Demons, basically filmed as a sequel to The Da Vinci Code, which was actually the second book written. On the positive side, that means if you missed (or skipped) watching the Code, you won’t have trouble following this film in the least.

It carries that rapid pace, ofttimes so difficult to achieve in writing, while creating a very thrilling mystery you just can’t help but watch. The ending is also one that (ofc) shakes the Vatican to its core, and as anyone who’s ever watched some Star Trek is likely to recall, what happens when matter makes contact with antimatter is explosive stuff ^_^.

The only thing I can complain about is that the film gives away the ending well in advance… if you are keen enough to be paying close attention in the first place. A bit of knowledge about psychology, as it impacts the kind of stuff we do over in SAS, also helps a sharp mind combine inductive and deductive thinking to narrow the possibilities down quite considerably. That being said, if you didn’t pay close attention, you might be shocked silly at the finale!

Hey, I’m a geek, my attention span has moments of hyper-focus on minute details all day long. Gotta love programming :-P.

Ugh, it’s been a long and unpleasant day! Nevertheless, I’ve almost got the MSVC builds sorted to where I want them. Basically, the reason the unix builds are shared libraries while the Windows builds are static libraries has to do with the respective linkers.

At least on FreeBSD i386, the (GNU) linker doesn’t complain about the common and sys modules referencing one another; you could say it takes the view that the shared lib is a chunk of code, and all is fine as long as everything resolves in the end. I generally prefer dynamic linking over static, although I have nothing against static libraries internal to a project; when it comes to Windows, however, I’m particularly fond of Microsoft’s SxS technology.

While the GNU stuff on my laptop is happy enough to obey, the link tool provided by MSVC won’t cooperate with that model of behaviour for shared libs (DLLs), only static libraries. Besides increasingly being stuff that belongs together anyway, the common and sys modules were merged into a single ‘core’ module, and tonight, prepped to better handle compiler specifics as well. Secondary is that, simply put, link makes shared libraries a bit more typing than need be. Every other sane OS/compiler pair I’ve encountered has the lovely habit of assuming that if you wrote a function in a library, you might want to share it with other programs. Visual C++, on the other hand, presents several ways of doing it that all basically amount to telling the linker which things an application may slurp up from the library: either writing a “.def” file, or wrapping up function definitions with __declspec(dllexport) attributes, and the correct __declspec(dllexport) or __declspec(dllimport) attributes at their declarations.

Microsoft’s way of doing things is more flexible, but one might fairly argue that the inverse behaviour (e.g. export anything not specially marked) would have been better.

Generally I like MSVC; I think it’s better than GCC, if you are willing to put up with the major lack of C99 compliance and the lack of stdint.h (I use one written by Paul Hsieh). The main downside is that the tools tend to be a bit, eh, stupider than the GNU brew, and the best parts of the system are likewise fairly specific to both MSVC and Windows NT. Personally I would enjoy a professional edition of Microsoft’s offerings, because it would net access to their 64-bit C/C++ compiler and much stronger profiling tools, which are simply missing from the express editions.

The sad part is that Visual Studio is the only software package I have seen Microsoft release in my entire life that’s worth buying… lol. Not even their operating systems can say that much, from where I sit.

My thoughts on “Debugger Tips: 8 ways breakpoints can save your next software project”

Debugger Tips: 8 ways breakpoints can save your next software project: “Here are eight fairly simple techniques for using breakpoints and other features of your C/C++ debugger that can give you enormous power and visibility into your program.”

An interesting article that’s worth the reading, for anyone who is ever going to get stuck running a debugger. Personally, I prefer log files and analyzing the code in my brain, but when it’s a task you can’t cram up there in grey matter, or you need to cuddle up to the runtime, a good debugger is your best friend.

I’ve learned two things tonight

  • Windows Services are really easy to work with.
  • If you use a big enough shoehorn, you can fill in the parts that come built into most unixes.
  • There has yet to be any Unreal Engine based game made, that doesn’t suck. At least, that wasn’t developed by Epic.
I say learned two things, because I already knew the third :-P.
Those who don’t understand the third point, are likely admitting their incompetence.

Overloaded and still bit shifting

Ugh, I’m freaking tired. I started the day off computing what changes would be needed for setting up nmake-based builds of Stargella, and everything has been snowballing since then.

I spent a considerable amount of time cursing at the MSDN Library over some very shoddy docs, and realising that despite the overall quality of MSVC, the actual build tools behind it have to amount to the stupidest ones I’ve ever seen. Although to be fair, the very first C compiler probably was worse, but this isn’t the ’70s :-P.

The deprecation of Code::Blocks for building things, and the switch to appropriate make systems, should mate more smoothly with my workflow. It also pisses me off that after all these years, the best tools for the job haven’t improved much. Unlike the typical morons^H^H^H^H^H^Hprogrammers I’ve had to suffer, I also know how to cook up a build set that shouldn’t be an almighty pain in the neck to use on another computer than my own workstation. Applying basics of computer science to software construction: many hours; having to use tools that quadruple your workload: priceless!

As soon as I battle-test one last makefile for Microsoft’s nmake, all should be ready for committing to the repo. Then I can worry about the next goals: a proper merge of the common and system modules into a central core, shuffling the Windows builds over to DLLs (to match the unix builds), and integrating PCC into the unix build stack. (For ease of compiling dependencies, only MSVC is supported on Windows: MinGW users are on their own.)
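For flavour, here is roughly the shape an nmake makefile for a hypothetical core module takes (module names and flags are illustrative, not my actual build); note the old-style .c.obj suffix rule where GNU make would use a %.o: %.c pattern rule.

```make
# nmake syntax: variables look familiar, but inference rules are
# suffix-based (.c.obj) rather than GNU make's %.o: %.c patterns.
CC     = cl
CFLAGS = /nologo /W3 /O2
OBJS   = core.obj sys.obj

core.lib: $(OBJS)
	lib /nologo /out:core.lib $(OBJS)

.c.obj:
	$(CC) $(CFLAGS) /c $<

clean:
	del $(OBJS) core.lib
```

Close enough to taste the same, different enough that you end up maintaining two sets of makefiles anyway.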

During the course of the day, I’ve also done plenty of the server admin loop, and have serviced more interrupts today than my processor sees in a week of abuse.

and all while carrying on several conversations, hahaa!

Got to enjoy one of my favourite films, Planes, Trains & Automobiles. In its own way, it’s an extremely well done film. Critically, what can anyone say, but that it’s an awfully small world with a lot of ground to cross. Sense of humour required :-P.

Combine the tightest prick this side of Chicago with the world’s most annoying shower curtain ring salesman, and the inevitable twists of fate, and you’ve got the holiday trip home from hell. Much of what goes wrong for Neal Page and Del Griffith you could only laugh at or cry about if it ever happened to you, and they finally break down laughing on the “Highway” once things go up in flames. It’s just something you’ve got to sit through and watch to enjoy, while being thankful you’re not these two poor schleps.

For how little it’s been on air over the past decade, I’ve been very happy to see it on often this past year. The movie also illustrates one of the few things I actually like about English:

Car Rental Agent: [cheerfully] Welcome to Marathon, may I help you?

Neal: Yes. 

Car Rental Agent: How may I help you? 

Neal: You can start by wiping that fucking dumb-ass smile off your rosey, fucking, cheeks! Then you can give me a fucking automobile: a fucking Datsun, a fucking Toyota, a fucking Mustang, a fucking Buick! Four fucking wheels and a seat! 

Car Rental Agent: I really don’t care for the way you’re speaking to me. 

Neal: And I really don’t care for the way your company left me in the middle of fucking nowhere with fucking keys to a fucking car that isn’t fucking there. And I really didn’t care to fucking walk down a fucking highway and across a fucking runway to get back here to have you smile in my fucking face. I want a fucking car RIGHT FUCKING NOW!

Car Rental Agent: May I see your rental agreement? 

Neal: I threw it away. 

Car Rental Agent: Oh boy. 

Neal: Oh boy, what? 

Car Rental Agent: You’re fucked!

You can string together arbitrary amounts of profanity, until you finally calm down lol.

Gimped from the start, but still an [SAS] man

From Game Scores Volume I

Annoyingly gimped in the first 15 seconds of the game, yet somehow managing to limp through one of the largest RvS missions, and doing it faster than I’ve seen some 4-man elements do it too, lol. Over the years, I’ve actually seen people take over 20 minutes to do that map, even when in a ‘hustle’.
I was supposed to be training Bajorq a bit on RvS, but ubi interfered, so I guess I got all the practice time :-S.

Well, here I sit, working out my plans for the weekend, after sorting a few things on the spot and spending some time on what I’ll just call a spurt of stuff to hammer out; I doubt tonight will be highly productive. It’s the little bit of rest I get before I have to get cracking away on stuff in the morning.

Besides digging into a few thousand pages’ worth of book, the chopping block includes:

  • Compile and test X.Org 7.5 on Dixie.
  • Finish my review of PC-BSD 8.
  • Complete the new auxiliary “Snipers” path on my Private Airport (kai) map.
  • A few SAS related details I won’t mention here
  • Get the plots sorted for the next 3 live ops (;)

Also, there’s something I have been meaning to do for a while: refactoring the common and system modules of Stargella. The idea I’m thinking of is merging them into a singular “core”, plus numerous other related changes, mostly in relation to compiler support and hooking in the render module. Two points of research are deprecating the FreeBSD/GCC builds with Code::Blocks in favour of recursive GNU Make, and perhaps also NMake for Microsoft’s compiler. I would also like to experiment with supporting the Portable C Compiler (pcc), which entails either system modifications to FreeBSD’s header files, or some redesign work on the game’s code.

I have multiple compilers available to me: several versions of GNU, the legendary PCC and Watcom compilers, and the express edition of MS Visual C++ 9.0. The only supported compilers for my project are GCC and MSVC, under unix and Windows based systems respectively. Whenever possible, things should be usable with most (reasonably) standards-compliant C compilers. Not that Microsoft’s compiler is one. The main reason I don’t do MinGW or Watcom builds under Windows is that it’s just easier to go a strict MSVC route when it comes to handling the dependencies. If it wasn’t for that, I would actually go as far as supporting as many compilers as possible.

One thing that really pisses me off is that even after 40 years or so, stitching together makefiles by hand is still one of the best ways to build stuff o/