Reflections upon my career

For the most part, I’ve never been a big believer in bucket lists. At least not the kind you wait until you’re dying to start checking off. Thinking about it recently, though, I’ve come to realize that the work I’ve done over the years has probably checked those kinds of boxes for my career in software engineering.

Over the past 16½ years of programming, I’ve …
Followed as part of a larger group where the big-picture issues were someone else’s problem, been the mythical programmer doing it all, led small groups where the big picture was my domain, and been the contact point for small groups.
Learned that I like design and architecture. Both creating them anew and studying existing projects.
Somehow ended up the guy everyone asks when they don’t know the answers.
Gotten to enjoy coffee machines that may have had more moving parts than my car.
Worked on traditional application and system level software, but also many other pieces that were off the beaten path: kernel level drivers that needed porting, microcontrollers that drive hardware interfaces, and libraries, tools, and frameworks.
Discovered those are all less magical than you think when you’re a young padawan. It’s less that it’s drastically different from normal software development and more that it’s important that you not screw up, explode, or paint yourself into a corner.
At times been both the smartest and the stupidest person in the room.
Made features people loved that were based off my ideas. Especially the curious ones, where I wanted to know how something worked and then found an imaginative use case for what I learned.
Made features people loved that were based off other people’s ideas. Especially the ones that made the product better for the customer.
Been one of the engineers that gets called when a customer goes down on a Sunday.
Been deemed the expert on some problem domain. Actually, I don’t want to know how many times that’s happened.
Seen code that I worked on make the magic happen and seen the results on a scope, even though I’ll never be able to spell oscilloscope from memory!
Been grateful for hardware engineers and technicians and their skill sets, and gladly worked alongside them.
Had my hands in more pieces of internal infrastructure than I can count. As a coworker recently pointed out, while “IT guy” has never been my job title at any of the places I’ve worked, I could probably run an IT department if I had to. The part that bugs me is that he was serious, and others agreed.
Been a webmaster, not that I miss that job.
Gotten to work with equipment that I always thought was so expensive that I would never be allowed to touch it.
Seen more than one 8-inch floppy diskette.
Oh wow, satellites!
Been one of the guys who knows too much about what needs doing after the power comes back on.
Both saved the day like Mr. Scott and reminded people that I am not in fact Scotty.
Quoted Jurassic Park more times than I ever thought possible.
Had to wear both my red shirt and my brown pants.
Kept working on a problem everyone else gave up on, and actually found a solution.
Written code to handle parsing existing formats and data streams, including at least one parser of MPEG2 Transport Streams and various proprietary things.
Written code, specifications, and documentation for formats and data streams I’ve created. Sadly, more often for proprietary things.
Debugged more than a few weird problems.
Been the guy that gets to solve a problem because the team that should fix it in their project decided it’s too much work to do the right thing.
Solved problems at both ends so a system stays tolerant when only one end has been upgraded.
Will probably forget more about the X Window System as I get older than younger folk will ever learn.
Will never forget there was a character encoding named EBCDIC, because the test equipment was so much older than I was that defaulting to EBCDIC rather than ASCII made sense when it was manufactured.
Worked on existing products and developed new ones that actually get used and deployed.
Never got to go to trade shows and conferences related to my fields, but was the chief code monkey on a product that got an award at one.
There will probably be at least one letter from a customer in my keepsake box.
Ahh. I’ve got to admit, it hasn’t been a dull career to date.

Ever since getting the Raspberry Pi Pico, there have been two experimental projects in the back of my mind.

The first is of course: how to run DooM on the Pico. Based on what I’ve seen, I suspect the main point of suffering would be the limited RAM compared to an i486 machine. Most of the console ports back in the day managed to shoehorn things into fairly modest systems, and I bet the two cores would work great for doing video/controller input on one core while the actual game runs on the other, as in the sketch below. What I haven’t been able to decide on is what path to take to explore that project. In my mind, I kind of see it as a more “Game Boy” like handheld with a screen and controls than anything else. I certainly don’t want to do ASCII DooM over a COM port :P. It would also be preferable to have separate storage that can address the storage capacity of WADs without having to cookie cutter a level into available flash, making the handheld style even more appropriate.
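Something like this is roughly how that core split could look with the Pico SDK’s multicore support. Just a minimal sketch: the frame hand-off is real SDK API, but game_tick() and draw_frame() are hypothetical placeholders of mine, not from any actual port.

#include "pico/stdlib.h"
#include "pico/multicore.h"

// Core 1: video output and controller input.
static void video_core(void)
{
    while (true) {
        // Wait for core 0 to hand over a finished frame number.
        uint32_t frame = multicore_fifo_pop_blocking();
        (void)frame;  // draw_frame(frame) and input polling would go here
    }
}

// Core 0: the actual game.
int main(void)
{
    stdio_init_all();
    multicore_launch_core1(video_core);

    uint32_t frame = 0;
    while (true) {
        // game_tick() would go here
        multicore_fifo_push_blocking(frame++);  // pass the frame to core 1
        sleep_ms(1000 / 35);                    // DooM's classic 35 Hz tick
    }
}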

Second is building what in essence would be a personal computer: a lot like the ’70s kit computers such as the Altair, but imagined through the eyes of a geek that grew up in front of an MS-DOS machine. It’s been stuck in my head for a while that the Pico is far more powerful than the early CP/M and DOS based systems, and that it isn’t that complicated to connect the Pico to external devices. From the perspective of fun, I think it would be neat to design a simple system around the Pico and build out something like a PC around it. On the downside, while creating a disk operating system in the vein of CP/M isn’t that big a stretch, I can’t really say that I fancy bootstrapping a toolchain to write programs for a custom operating system. But it’s an idea that keeps floating around whenever I look at how powerful the Pico is.

As a side note, I kind of wonder how hard it would be to replace the CRT in an old Macintosh SE style case with a similar sized LCD panel, while gutting the rest of the insides and just using the case as the mechanical environment to mount stuff. Really, I’m not sure if that’s brilliant or sacrilegious toward such historic machines. Although to be fair, people have done some strange things with the cases of old busted Macs over the years….hehe.

Signs of a simpleton having fun with a new microcontroller:

  1. Write a program that makes the LED blink like a mother fucker.
  2. Write a program that spams a hello world to USB serial.
  3. Write a Read Eval Print Loop over USB serial.
Compared to what I’ve done in C with simpler micros like the 8051 family, I’m finding the RP2040 really damned nice. Not only because of the Cortex-M0+’s horsepower, but because of the really nice library that comes with the Raspberry Pi Pico. For the hell of it, I decided to abuse it with some simple C++ for the REPL, just to see that C++ I/O and string handling does in fact work.
Of course, me being me, I ended up with a really simple set of commands:

#include <string>

#include "pico/stdlib.h"

using std::string;

// A plausible blink() helper; the original isn't shown in the post.
// Assumes the LED pin was set up with gpio_init()/gpio_set_dir() at startup.
static void blink(uint32_t ms)
{
    gpio_put(PICO_DEFAULT_LED_PIN, 1);
    sleep_ms(ms);
    gpio_put(PICO_DEFAULT_LED_PIN, 0);
    sleep_ms(ms);
}

static string evalline(const string& line)
{
    if (line.empty())
        return "";
    if (line == "monkey")
        return "Willow?";
    if (line == "monster")
        return "Corky?";
    if (line == "sweet")
        return "Misty?";
    if (line == "help")
        return "Try nicknames with fur";
    blink(100);  // three quick blinks for an unknown command
    blink(100);
    blink(100);
    return string("Unknown command: ") + line;
}

Because why not? 😜
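Driving that over USB serial is about as minimal as it sounds. Roughly like this, assuming USB stdio is enabled in the build (pico_enable_stdio_usb in CMakeLists.txt) so the standard streams ride the serial port; not my exact loop, but the same idea:

#include <iostream>
#include <string>

#include "pico/stdlib.h"

int main()
{
    stdio_init_all();  // brings up USB serial as stdin/stdout
    std::string line;
    while (true) {
        std::cout << "> " << std::flush;
        if (std::getline(std::cin, line))
            std::cout << evalline(line) << std::endl;
    }
}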

Huawei’s HarmonyOS: “Fake it till you make it” meets OS development

Forking and building off Android is what I would call the natural response to Huawei’s situation. And pretty much as long as you respect the open source license agreements and such, nothing is wrong with that. The ability to do so is one of the best aspects of Android; going it without the Google add-ons is one of the reasons fewer people do that with actual phones.
Ron kind of says it neatly here:
Forking Android and launching your own rebranded operating system is totally fine. But be upfront about that. Say “HarmonyOS is a fork of Android” instead of “HarmonyOS is not a copy of Android.” Don’t call HarmonyOS “all-new” when pretty much the opposite is true.
That’s where the real ire of the story lies, and perhaps justly so. But the conclusion towards the end of the article also makes sense: inside China the Google’less Android idea works; as an international product, not so much.

As a young programmer, I think it was probably around version 2.2 or 2.4 that I started to use Python. Today it remains one of my favorite languages.

I remember thinking quite highly of the Zen of Python at the time. Much time has gone by since then, and to be frank, I think it even more beautiful and dead-on-balls accurate today than I did then. The older I get as a programmer, the more accurately it reflects reality.

Beautiful is better than ugly.
Explicit is better than implicit.
Simple is better than complex.
Complex is better than complicated.
Flat is better than nested.
Sparse is better than dense.
Readability counts.
Special cases aren't special enough to break the rules.
Although practicality beats purity.
Errors should never pass silently.
Unless explicitly silenced.
In the face of ambiguity, refuse the temptation to guess.
There should be one-- and preferably only one --obvious way to do it.
Although that way may not be obvious at first unless you're Dutch.
Now is better than never.
Although never is often better than *right* now.
If the implementation is hard to explain, it's a bad idea.
If the implementation is easy to explain, it may be a good idea.
Namespaces are one honking great idea -- let's do more of those!

Baldur’s Gate 3 devs built a testing AI. Then, they tried to defeat it.

That is kind of neat, and likely worth the effort as it gains more capabilities for abusing the game.

Television, novels, and comics tell us one day the super cool A.I. will be super smart, and may or may not try to kill us all. Personally, I think the future looks more like a series of special purpose constructs aimed to help us with specific tasks. That’s the super-cool A.I. I’m looking forward to, because I’m probably going to be dead decades or centuries before we see anything like Cortana or Jarvis, lol.

New safety rules in C++ Core Check 

The for loop part may be a bit spotty, in the sense that you should be paying attention and know which form you want when writing the statement. But I have to admit, I would have liked the auto warning.

I’ve mainly relied on C++14 since about 2016 for my C++ use whenever possible, and sometimes newer because of the standard library. While catching up wasn’t as painful as I thought, aside from how long it’s taken for C++17 to trickle through various Linux distributions, there were a few things I found myself watching more closely.

One of these was the way auto typing copes with references. I learned pretty quickly to watch my syntax when using auto: if a real warning had been available, it would have been much faster to notice the distinction. In my code, you’ll usually see auto used when spelling out the type would be redundant salt. But we usually want our sugars not to have unintended pains, lol.
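The gotcha, for anyone who hasn’t been bitten: plain auto deduces a value type and silently copies, while auto& preserves the reference. A minimal illustration:

#include <vector>

int main()
{
    std::vector<int> v{1, 2, 3};

    auto a = v[0];   // deduces int: a is a copy
    a = 42;          // v[0] is still 1

    auto& b = v[0];  // deduces int&: b aliases v[0]
    b = 42;          // v[0] is now 42
}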

The code samples on the switch/case part also make me remember just how verbose C’s simple switch/case statements are by today’s standards. Damn it, now my mind is flashing to SML….lol
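For the sake of the quip, the ceremony in question: every case needs its explicit break, and the result has to escape through a variable rather than the statement itself.

#include <cstdio>

static const char* name_of(int code)
{
    const char* name;
    switch (code) {
    case 1:  name = "one";   break;
    case 2:  name = "two";   break;
    default: name = "other"; break;
    }
    return name;
}

int main()
{
    std::printf("%s\n", name_of(2));  // prints "two"
}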

My cxxversion and stdheaders

In various bits of code, I’ve been using these headers to solve the “__cplusplus says yeah, but actually compiling says go to hell” problems. Because sometimes the macro tells you one thing, and actually trying to preprocess and link makes naughty hand gestures on the systems I encounter.

Finally, I’ve gotten around to sticking these in their own repo.

Which is much nicer than going: “Which project did I last update that in?”
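As a hypothetical flavor of the problem these guard against (an illustration of mine, not the actual contents of the repo): __cplusplus may claim C++17, yet <filesystem> can still be missing or only shipped as the experimental TS on a given system, so you probe the headers themselves:

#if defined(__has_include)
#  if __has_include(<filesystem>) && __cplusplus >= 201703L
#    include <filesystem>
namespace fs = std::filesystem;
#  elif __has_include(<experimental/filesystem>)
#    include <experimental/filesystem>
namespace fs = std::experimental::filesystem;
#  else
#    error "No usable <filesystem> despite what __cplusplus claims"
#  endif
#endif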

Passing Thoughts: BIMF

Someday someone needs to create a build tool called BIMF: Build It Mother Fucker.

Bonus points if you get Samuel L. Jackson to provide voice over for your error messages 🤣

On occasion, I read (and see) things and am amused. The graphic at the end of this nVidia post on Vulkan Memory Management is a key example.

The Good: a memory allocation for a big assed buffer with your stuff.
The Bad: a memory allocation for a bunch of buffers with your stuff.
The ….: a bunch of memory allocations for a bunch of buffers with your stuff.
Yes, I am easily amused. No, I am not typing those wonky symbols.
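In code terms, the “good” column boils down to one vkAllocateMemory call backing several buffers at distinct offsets. A rough sketch, assuming an already created VkDevice and a suitable memory type index, with the VkMemoryRequirements and alignment bookkeeping elided:

#include <vulkan/vulkan.h>

// One allocation, two buffers bound at offsets into it.
void bind_two_buffers_to_one_allocation(VkDevice device,
                                        uint32_t memoryTypeIndex,
                                        VkBuffer a, VkBuffer b,
                                        VkDeviceSize sizeA, VkDeviceSize sizeB)
{
    VkMemoryAllocateInfo info = {};
    info.sType = VK_STRUCTURE_TYPE_MEMORY_ALLOCATE_INFO;
    info.allocationSize = sizeA + sizeB;   // one big block for both buffers
    info.memoryTypeIndex = memoryTypeIndex;

    VkDeviceMemory memory = VK_NULL_HANDLE;
    vkAllocateMemory(device, &info, NULL, &memory);

    vkBindBufferMemory(device, a, memory, 0);      // buffer a at the start
    vkBindBufferMemory(device, b, memory, sizeA);  // buffer b after it, by offset
}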