Thoughts on recent SAS resignations

The more I look at the facts, the more I am convinced that this was a mutiny in spirit. Among the first wave of people to resign, there appear to me to be three core factions: one group was overruled by GCHQ, another carries Rasa’s infection of placing their own ways above [SAS]’s, and then there’s Rouge.

When GCHQ’s ruling wasn’t to their liking, it seems that a few people decided that they know better than the ones in authority. That same “I’m right, GOD is wrong” attitude has built many a house on sand… I’m glad [SAS]’s was built on titanium-reinforced concrete.

A few are people with stiff necks who know they will never have the subversion of [SAS] that they desire; a man who once refused to teach SOPs comes to mind. This is what I mean by “Rasa’s infection”.

Some time ago, even I lost all my faith in GCHQ, when it looked like a set of RvS-centric SOPs would be fast-tracked through and every word I ever said to hold this clan together would become a lie. Every member of GCHQ proved themselves to me, except for Rouge. He lost that fight, and I’m starting to feel that his involvement now is because he lost, and because the ideal of [SAS] as one team, two games, won out.

So I suppose, having been given judgement, and having decided that the highest court in the land is no better than their own, a few people decided they could do better. I really pity Timbo and Scout if they organise a clan under Rouge’s armchair generalship.

A little size of fun.

Generally, I’m rather finicky about making assumptions about the sizes of types, and even about conversions between signed and unsigned types. Although I occasionally skirt dangerous ground, such as feeding a function pointer into an object pointer and expecting to be able to cast the void* back to the function pointer (basically implementation-defined at best in C, but required by POSIX), I also tend to make notes of when I (fully aware) do things that are non-portable but not necessarily obvious. At least in the example case I just mentioned, I didn’t know that was dangerous ground until I reviewed code under -pedantic, and scratched my head at the resulting warning message.
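For the curious, this is the kind of round trip I mean; just a minimal sketch, and the casts are exactly where a -pedantic compiler is entitled to grumble, since ISO C gives them no guaranteed meaning even though POSIX (think dlsym()) effectively requires them to work:

#include <stdio.h>

static void greet(void)
{
    puts("hello");
}

int main(void)
{
    /* ISO C gives no guaranteed meaning to converting between function
     * and object pointers; POSIX (e.g. dlsym()) effectively requires it.
     * A -pedantic compiler may warn about both casts below.
     */
    void *obj = (void *)greet;              /* function pointer -> void*  */
    void (*fn)(void) = (void (*)(void))obj; /* ...and back to a call site */
    fn();
    return 0;
}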

Normally I take things in stride, and just cringe when I see “portable” software doing stupid things like using unsigned int where they mean uint32_t, or making the (flawed) assumption that a pointer to xyz will be as large as an object of xyz. So I thought I’d take a look-see here, and wrote a program to display the sizes in bits rather than bytes, since most of the folks I know will get the picture better that way :-o.

Also being a practical man, I kind of like to know what is off the edge of the map, just in case I some day have to jump off o/.

Here is a simple program to satisfy my curiosity:

#include <stdio.h>
#include <limits.h>
#include <stddef.h>
#include <stdint.h>
int
main(void) {

    printf("sizeof(char)t= %u-bitsn", sizeof(char)*CHAR_BIT);
    printf("sizeof(char*)t= %u-bitsn", sizeof(char*)*CHAR_BIT);
    printf("sizeof(wchar_t)t= %u-bitsn", sizeof(wchar_t)*CHAR_BIT);
    printf("sizeof(wchar_t*)t= %u-bitsn", sizeof(wchar_t*)*CHAR_BIT);
    printf("sizeof(short int)t= %u-bitsn", sizeof(short int)*CHAR_BIT);
    printf("sizeof(short int*)t= %u-bitsn", sizeof(short int*)*CHAR_BIT);
    printf("sizeof(int)t= %u-bitsn", sizeof(int)*CHAR_BIT);
    printf("sizeof(int*)t= %u-bitsn", sizeof(int*)*CHAR_BIT);
    printf("sizeof(long)t= %u-bitsn", sizeof(long)*CHAR_BIT);
    printf("sizeof(long*)t= %u-bitsn", sizeof(long*)*CHAR_BIT);
    printf("sizeof(long long)t= %u-bitsn", sizeof(long long)*CHAR_BIT);
    printf("sizeof(long long*)t= %u-bitsn", sizeof(long long*)*CHAR_BIT);
    printf("sizeof(size_t)t= %u-bitsn", sizeof(size_t)*CHAR_BIT);
    printf("sizeof(size_t*)t= %u-bitsn", sizeof(size_t*)*CHAR_BIT);
    printf("sizeof(float)t= %u-bitsn", sizeof(float)*CHAR_BIT);
    printf("sizeof(float*)t= %u-bitsn", sizeof(float*)*CHAR_BIT);
    printf("sizeof(double)t= %u-bitsn", sizeof(double)*CHAR_BIT);
    printf("sizeof(double*)t= %u-bitsn", sizeof(double*)*CHAR_BIT);
    printf("sizeof(long double)t= %u-bitsn", sizeof(long double)*CHAR_BIT);
    printf("sizeof(long double*)t= %u-bitsn", sizeof(long double*)*CHAR_BIT);
    printf("sizeof(ptrdiff_t)t= %u-bitsn", sizeof(ptrdiff_t)*CHAR_BIT);
    printf("sizeof(ptrdiff_t*)t= %u-bitsn", sizeof(ptrdiff_t*)*CHAR_BIT);
    printf("sizeof(intptr_t)t= %u-bitsn", sizeof(intptr_t)*CHAR_BIT);
    printf("sizeof(intptr_t*)t= %u-bitsn", sizeof(intptr_t*)*CHAR_BIT);

    return 0;
}


The C standard defines CHAR_BIT in limits.h as the number of bits in the smallest object that is not a bit-field, roughly meaning that CHAR_BIT = number of bits in a byte, for all practical intents and purposes. Likewise, the sizeof operator is defined as returning the size of its operand in bytes, as an implementation-defined unsigned integer value having the type size_t, from stddef.h. For the fuckos out there, the standard also says that a char object is large enough to store any character of the basic execution character set (A-Z, a-z, 0-9, space, plus the required punctuation and control characters—roughly a character set of 99 symbols that fit within a single byte), and that those characters will have a non-negative value while doing it. It also declares that sizeof(char) == 1. From this we can infer that sizeof(x) * CHAR_BIT should be the size of x in bits, and that ‘x’ is basically as good as off the edge of the map for any x that you can’t make on my grandmother’s typewriter.
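If I ever bother to tidy the program above, it would probably be with a little macro along these lines; just a sketch of that same inference, printed with %zu since sizeof yields a size_t:

#include <limits.h>
#include <stdio.h>

/* Size of a type or expression in bits: sizeof gives bytes, CHAR_BIT
 * gives bits per byte, and the result is a size_t (hence %zu).
 */
#define BITS_OF(x) (sizeof(x) * CHAR_BIT)

int main(void)
{
    printf("sizeof(int)\t= %zu-bits\n", BITS_OF(int));
    printf("sizeof(int*)\t= %zu-bits\n", BITS_OF(int *));
    return 0;
}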

Displaying the size of each type followed by the size of a pointer to it is mostly done to emphasize that the size of a type means dick all about the size of a pointer to it. You’ll notice an interesting connection between pointer size and your hardware, however. Gee, that just doesn’t sound right, LOL.

Some examples:

Run on FreeBSD 8.0-STABLE i386:

sizeof(char)    = 8-bits
sizeof(char*)   = 32-bits
sizeof(wchar_t) = 32-bits
sizeof(wchar_t*)        = 32-bits
sizeof(short int)       = 16-bits
sizeof(short int*)      = 32-bits
sizeof(int)     = 32-bits
sizeof(int*)    = 32-bits
sizeof(long)    = 32-bits
sizeof(long*)   = 32-bits
sizeof(long long)       = 64-bits
sizeof(long long*)      = 32-bits
sizeof(size_t)  = 32-bits
sizeof(size_t*) = 32-bits
sizeof(float)   = 32-bits
sizeof(float*)  = 32-bits
sizeof(double)  = 64-bits
sizeof(double*) = 32-bits
sizeof(long double)     = 96-bits
sizeof(long double*)    = 32-bits
sizeof(ptrdiff_t)       = 32-bits
sizeof(ptrdiff_t*)      = 32-bits
sizeof(intptr_t)        = 32-bits
sizeof(intptr_t*)       = 32-bits

and FreeBSD 8.0-RELEASE amd64:

sizeof(char)    = 8-bits
sizeof(char*)   = 64-bits
sizeof(wchar_t) = 32-bits
sizeof(wchar_t*)        = 64-bits
sizeof(short int)       = 16-bits
sizeof(short int*)      = 64-bits
sizeof(int)     = 32-bits
sizeof(int*)    = 64-bits
sizeof(long)    = 64-bits
sizeof(long*)   = 64-bits
sizeof(long long)       = 64-bits
sizeof(long long*)      = 64-bits
sizeof(size_t)  = 64-bits
sizeof(size_t*) = 64-bits
sizeof(float)   = 32-bits
sizeof(float*)  = 64-bits
sizeof(double)  = 64-bits
sizeof(double*) = 64-bits
sizeof(long double)     = 128-bits
sizeof(long double*)    = 64-bits
sizeof(ptrdiff_t)       = 64-bits
sizeof(ptrdiff_t*)      = 64-bits
sizeof(intptr_t)        = 64-bits
sizeof(intptr_t*)       = 64-bits

I also have access to 32-bit versions of Windows NT and OpenBSD running on Pentium 4-grade hardware, but I don’t feel like booting the Wintel tonight; I’m too comfortable with Dixie, hehe. Perhaps I will run the program on other systems and implementations, for the sake of testing, and add it to this entry as a comment.

Bad omen or coincidence?

Something rather odd happened to me about ten minutes ago. I was changing the dogs’ potty pads when one of my grandfather’s paintings on the wall literally tilted to one side, going crooked. In my family, I reckon most people would call that a bad omen, but me, well, I have fewer superstitions, normally.

I took a step up on the step stool and tilted it back, when I noticed that the hanging cord sliding on the hook looked ‘thinner’ now. So I took it off the wall, remarking that I’m too short for this stuff, and examined it. Sure enough, the old wire cord was badly frayed, practically looking like barbed wire; the spot it had been hanging from for years was quite noticeably narrower than the rest. I reckon the picture probably dates back to the 60s, give or take a decade; not sure when he did most of them.

Coincidentally, the painting is of Christ on a hill top with his arms out.

Silly thoughts about the history of C/C++ coding

Note, this is meant to be taken more tongue-in-cheek than seriously; take offense at your own expense 8=).

  • All the fun C library extensions can be blamed on UNIX, less intelligent systems still don’t implement them ^_^.
  • Most of the stupid type systems out there are reminiscent of the Win16 API, watch out for the Hungarians in the group.
  • Too much programming for Windows can lead to strange tattoos, that’ll get you sued unless you’re a pioneer.
  • There’s enough C++ Hungarian in the gaming industry to make you want to go parler français with the ladies instead.
  • If you want to sit in your own little world, stick to mobiles or Macs; or ask the Free Software Foundation for this week’s Kool-Aid.
  • It’s alright if you need a road map to find your way through all that DDE/OLE/COM/ActiveX/DCOM mess and their variants, so does Microsoft: that’s why they invented .NET for your InterCOM.
  • Watch out for the babes in the null pointer cafe.
  • sizeof(foo) ≠ sizeof(foo*); unless thou hast violated the tenth commandment.

Whew, it’s been a jumpin’ hopin’ day!

I was up all night fiddling with Code::Blocks and Stargella, plus work this morning, and ideas for an interesting project. However, it’s a project that would call for C++, and I hate C++, lol. I like C, but hate C++… funny. On the upside, I’ve finally gotten Stargella builds sorted out, and I’m somewhat tempted to hunt down and eliminate ‘itches’ here and there, but I’m not sure whether or not patches would be welcomed for some things that seem to me to be good ideas. They’ve accrued a fair number of open patches and bug reports, so I don’t think it’s a highly active project. The code base is only about sixty-some thousand lines, not too big for all it does.

Sigh, nothing but work on the horizon o/. At least there’s something on TV tonight that I haven’t seen before, the original Night of the Living Dead. Geeze, what is it with all the zombie flicks lately? Compared to what I’ve seen of the remake (beginning / last half), it certainly has a more spirited beginning; bah, I need some popcorn lol. I’ve no idea ottomh what they filmed it in, but it does give an interesting feel to the movie, different from most contemporary films of the era.

A “D’oh!” moment amid success :-/

Today I launched Operation Breadsticks, a little mission meant to explore the idea, “What if the Haitian earthquake was man-made?”. The first map was Made of Stone, a map I haven’t really played since before joining [SAS]. I set up the `Tactical Operations Command` centre near the spawn point and adopted the callsign Valkyrie 62, much like in the LO to China.

Everything went fairly smoothly: the search team rescued the secret hostage and moved deeper into the building, we took a few hard-to-avoid casualties, and we aced the op. Where things got hairy was the second map. For some reason only Escrt could spawn correctly and all the terrorists were missing :-S. So after plan B, a hot fix, went belly up, I called for plan C and let the members have fun on the first map.

I’m taking the week off from everything except ‘work’ (lol), but intend to sort things out in my rest time. OK, so hacking up a map in UnrealEd is not restful… but it’ll be worth it. Because we need to be able to spawn, and I have to fix the map to *properly* support multiplayer, it was necessary to postpone map #2. Since there were no tangos either, I’ll have to create a proper scenario; in the meantime, I can just release a tangoless version that people can train on. Then, when things are ready, any given weekend, we’re back in action!

🙂

The price of pissing me off.

A word of warning: this post contains quite a bit of profanity after the jump break.

After being up until well past 0400 last night setting up a decent SCons build setup on FreeBSD, to test its usefulness to this project, I started setting up the required config tweaks today on the Windows machine.

Then, suddenly, I found an odd difference between how the same SCons version behaves on FreeBSD/Unix and on Windows; in fact, this kind of thing is why I’ve given up on the more popular CMake.

lib = os.path.join('#', outdir)
#
# For some really mother fucking stupid reason, Glob('*.$OBJSUFFIX') works
# on FreeBSD scons 1.2.0_20091224 but not on the Win32 scons 1.2.0_d20091224
# from the installer. So we have to use this fucking method, which will require
# us to wrap it in exceptions in case OBJSUFFIX is None or missing totally on 
# platform X.Y.Z. from fucking hell -- I fucking hate software build tools.
#
src = None
try:
    src = os.path.join('#', outdir, '*' + str(env['OBJSUFFIX']))
except KeyError:
    print("Fatal error: I can't make libs from object files without a file "
          + "extension. TERMINATING THIS SCONSCRIPT")
    import sys
    sys.exit(127)

env.StaticLibrary(lib, Glob(src))



# edit: an explanation of why subst() isn’t used in place of the [] lookup: once you violate the consistency part, I’m not going to trust that there will be an $OBJSUFFIX to subst() on.

I was going to make most of this code a simple library function for the sake of code reuse, instead of having a virtual copy/paste of the same few SConscript files between all my games’ source modules—after this, I’ll stick to having one of those comment blocks in each source module’s SConscript file as appropriate.

Generally I like SCons; it’s rather similar to what I’ve envisioned as the “perfect build tool”, even though SCons still requires about three times as much work for Stargella’s build system as tmk should. However, SCons is here today, and tmk won’t be for a long time. What annoys me is that SCons is still just as bad as all the other build tools out there.

For crying out loud, people, it is the twenty-first fscking century… and asking for some consistency from our modern build tools still demands limiting ourselves to the 1970s-era make tool from PWB/UNIX.


So busy, food even fell by the way side :-S

I spent yesterday experimenting with DevIL; I rather think that it will be more “to my taste” than using SDL_image for getting easy access to various image files. I only expect to use certain image formats in my games kit, but ahem, if that ever changes, who wants to write per-image-type handling code anew? Hehe. In playing with that, I’ve uncovered a bug in my zipped package handling, and have coded myself into a corner wherein the system would best serve me by growing a more abstract file-handling API—exactly the direction I wanted to go in, hahahaha! TBH, I’ve only ever coded myself into two corners in my life, but this one’s a happy case.
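I haven’t written that API yet, so take this as nothing more than a sketch of the shape I have in mind (every name here is made up): a small table of operations, so the image loaders never care whether the bytes come straight from the OS or out of a zipped package.

#include <stddef.h>

/* Sketch only: a file handle that hides where the bytes come from. */
struct vfs_file;

struct vfs_ops {
    size_t (*read)(struct vfs_file *fp, void *buf, size_t nbytes);
    int    (*seek)(struct vfs_file *fp, long offset, int whence);
    long   (*tell)(struct vfs_file *fp);
    void   (*close)(struct vfs_file *fp);
};

struct vfs_file {
    const struct vfs_ops *ops;  /* stdio-backed, zip-backed, and so on */
    void *impl;                 /* backend's private data              */
};

/* Loaders call these and never see the backend behind them. */
size_t vfs_read(struct vfs_file *fp, void *buf, size_t nbytes)
{
    return fp->ops->read(fp, buf, nbytes);
}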

Most of the night was spent experimenting with a pair of Integrated Development Environments, CodeLite and Code::Blocks, as possible replacements for the existing build.{sh,bat} scripts. There is also a Visual C++ solution and project tree that I whipped up Thursday to replace build.bat, and I am still cursing over its dislike for my build tree o/. In terms of IDEs, Microsoft has the best, but like virtually everything at MS, it’s also as vendor-specific as writing machine code in octal 8=).

Really, I couldn’t care less about having the compiler, editor, debugger, … all integrated—otherwise I’d use GNU Emacs, screen, the GNU toolchain, and their dialect of C, and spit on everyone else’s graves >_>. The thing that interests me is having a suitable build system, one that can handle multiple compilers/configurations and multiple OSes readily, because `tmk` is a project on my back burner and won’t be completed any time soon. Code::Blocks is like modern MSVC/VC++: the file format is XML-driven and quite easy to modify in a standard text editor.

Installing various libraries on Windows has shown me that most developers don’t know how to create a build system. At least, most open-source developers can’t seem to combine computer science 1.0.1 with Visual Studio projects.

Installing various programs on Unix systems has shown me that many developers don’t understand autotools, or that it’s just too complex a build system (I vote for both).

Me, I say fuck y’all. Do it right or don’t let it leave your workplace.

And so passes another black day

I’m rather glad to be heading back to work, and happy to see the 12th go. January is generally regarded as an off month in my family… January 12th, 1989 is the day my father died, as well as my parents’ wedding anniversary—what a peachy combination!

While I try to respect my family’s history, it’s also a matter I try not to dwell on for very long. *sighs*.

Being stuck with walking around the supermarket (baah, humbug!), I tried to devote my trains of thought to Stargella. For me, it is kind of a difficult project; much more complex than most of the things I’ve worked on.

In thinking about how to structure the rendering code, I started to think more about dealing with the data sets to be rendered, which basically means sprites (2D), models (3D), and rendering the level (2D/3D). I want as much of the code as possible to be shared between several game projects, which makes baking the lasagna a bit involved. Since my current focus is on the 2D-oriented Stargella, I find it wiser to concentrate on sprites than on three-dimensional models.

At first I was thinking it would be ideal to represent animations for sprites as a simple image stack; but I don’t want to go through all that for unanimated sprites, which would still require a single stack frame. The fix is simple: only use stacks for holding the animation details!

Then I was thinking, gee, wouldn’t it be cool if we could attach some type of script or shader program to be run, so a level of programmability could be used in implementing special effects. For example, using a program to stake out the centre of a starship, split it into segments and have them drift away from one another, while interleaving it with an explosion-like image or particle effect. Simple enough to do from C, I’m sure, but I would much rather have it be a part of the game data than the game code. I’m not sure if that would be doable by mating it with GLSL files, or if it would mean binding a script to the sprite. Either way, as far as general game scripting goes, it should have some level of access to the graphics stuff as a matter of completeness.
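Back to the image stack idea for a second: roughly the data layout I keep doodling in my head, with every name here being hypothetical, is that every sprite carries one base image, and only animated sprites carry the frame stack.

#include <stddef.h>

/* Sketch: an unanimated sprite just leaves `anim` as NULL. */
typedef unsigned int texture_id;     /* index into the loaded-texture array */

struct anim_frame {
    texture_id image;
    float      duration;             /* seconds this frame stays on screen */
};

struct animation {
    struct anim_frame *frames;       /* the "image stack"                   */
    size_t             nframes;
    size_t             current;
};

struct sprite {
    texture_id        image;         /* used when anim == NULL              */
    struct animation *anim;          /* NULL for unanimated sprites         */
    float             x, y;
};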

For now, I think I’ll concentrate on resource handling. It should be easy enough: just create dynamic arrays as needed (e.g. of loaded textures), and each time a new texture is loaded, assign it a unique identifier that indexes into the array; …
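Something like the following is all I really mean by that; a sketch under the assumption that textures are registered once and then looked up by index (names made up as usual):

#include <stdlib.h>
#include <string.h>

/* Sketch: loaded textures live in a growable array, and the identifier
 * handed back to callers is simply the index into that array.
 */
struct texture {
    char        *name;       /* e.g. the path it was loaded from */
    unsigned int gl_handle;  /* whatever the renderer hands back */
};

static struct texture *textures;
static size_t ntextures, captextures;

/* Register a loaded texture and return its identifier, or -1 on failure. */
long texture_register(const char *name, unsigned int gl_handle)
{
    if (ntextures == captextures) {
        size_t ncap = captextures ? captextures * 2 : 16;
        struct texture *tmp = realloc(textures, ncap * sizeof *tmp);
        if (tmp == NULL)
            return -1;
        textures = tmp;
        captextures = ncap;
    }
    textures[ntextures].name = strdup(name);
    textures[ntextures].gl_handle = gl_handle;
    return (long)ntextures++;
}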

Honestly, I don’t have a problem with programming in C, other than that it can be time-consuming at times o/. But because I have to learn OpenGL along the way, it is, shall we say, a freaking lot of docs to parse.

While I’m more familiar with Direct3D than OpenGL, and even though you can still get a good bang for your buck out of a DirectX SDK without having to stoop to C++, I’m not willing to write a DirectX backend for my game. Obviously I want to avoid using glThis and glThat all over the game code, but ahem, the issue of shaders enters the picture. Short of making a definitive choice at compile time, one that would have to live for the project’s entirety, the only way to support a runtime choice would be to write both HLSL (DirectX) and GLSL (OpenGL) code, and that’s a headache I won’t put up with! As much as I favour the idea of nVidia’s Cg, it is not available on enough platforms, and implementing it where needed would be too much trouble.
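So the choice gets made at compile time. What I mean, with entirely hypothetical names, is a thin gfx_* layer that the game calls; which backend file gets compiled in behind it is decided by the build system, not at run time.

/* Sketch: game code only ever calls this thin layer; the build system
 * decides which backend translation unit implements it.
 */
void gfx_init(int width, int height);
void gfx_draw_sprite(unsigned int texture, float x, float y);

#if defined(GFX_BACKEND_D3D)
/* gfx_d3d.c would live here: HLSL shaders, the backend I won't write. */
#else
/* gfx_gl.c: glThis and glThat stay behind these calls, shaders stay GLSL. */
void gfx_init(int width, int height)
{
    (void)width; (void)height;           /* stubbed for the sketch */
}

void gfx_draw_sprite(unsigned int texture, float x, float y)
{
    (void)texture; (void)x; (void)y;     /* stubbed for the sketch */
}
#endif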

If people other than me want to play the game, they can buy a graphics card made in the 21st century and install its drivers >_>.

— this post has been interrupted so many damn rat-fucking times by family that it is unfinished and will likely always remain so. This is one of many reasons why I fucking hate my life.