Perhaps I'm a victim of great storytellers, or maybe it's my nostalgia for a time I never knew, but the idea of 'old-school' game development really has me stuck. My interest in programming was sparked by several books I read about the early days of the personal computer industry. The stories of teenagers creating Spacewar! on the PDP-1 and poking memory locations in the Atari 800 trying to figure out how to program it somehow infected me. Reading about the technical exploits of John Carmack, building game engines that let a personal computer create graphical experiences others thought impossible, implanted in me a deep desire to fully understand the machine so that I might one day wield such masterful control over it.
The intrigue of computer games is probably much the same for everyone who gets into them: it's a virtual world that's different from the one you live in. Being able to create games yourself lends you the ability to generate your own universe from your imagination. That possibility alone makes the computer, and an understanding of its inner workings, all the more precious.
This all sounds great, but when I really think about it, what probably drove those early coders deep into the machine was the simple fact that what they wanted to do was very new; there was no information out there about the topic, so they had to figure it out themselves. If you're learning about computers or game development today, however, you're more likely to be overwhelmed by the volume of information available. This should be a good thing, but this is where I recognize the hang-up I've allowed to develop: no resource I find provides that hacker-level, direct-memory-accessing, assembly-routine-writing sort of thrill I'm seeking, so I lose interest.
I'm not entirely irrational, so I'm aware that I'm unlikely to find any resources about writing graphics routines in assembly for modern computers. As such, it seems completely rational to go out to Amazon and buy a bunch of '90s-era books on graphics programming to try and get my nostalgic fix. This is where the problem arises. As soon as I get an MS-DOS programming environment set up and get going, my rational mind starts reminding me that I'm wasting my time. Since my ultimate goal is to become a software engineer, both for technical fulfillment and economic prosperity, my subconscious becomes incredibly irritated that I'm spending valuable time I could be using to build the skills that will get me that dream job in software. I become anxious and unfocused, and begin trying to convince myself that getting a GUI up on the screen in Windows 2000 is somehow furthering my career in 2018.
As I've come to grips with the fact that trying to learn to program like it's 1992 is little more than a case of giving the wrong priority to what should be a hobby, I've also come to see the more subtle effects this tendency has had. For one, it made learning GUI development in Java miserable. Placing buttons on panels in Swing and coding event listeners felt like the most lackluster exercise on earth. If I'm not writing a game loop and trying to learn buffer flipping, I feel useless. In the middle of what should be a study session to solidify my understanding of, say, the AWT classes, I end up trying (and failing) to draw pixel art on a window with Rectangle2D objects. Another thing I've allowed to get out of control is the tendency to dig for too much background information when learning a simple topic. I remember one day not too long ago, as I was learning to implement a MouseMotionAdapter in Java, I ended up wasting a good hour scouring JVM documentation and the Web trying to figure out how Java gets the mouse position data from the operating system.
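The irony is that the exercise itself is tiny. A minimal sketch of the kind of listener I was supposed to be writing might look like the following (the class name, field names, and coordinate values here are my own invention for illustration; the adapter is fed a hand-built event so it runs without a display):

```java
import java.awt.event.MouseEvent;
import java.awt.event.MouseMotionAdapter;
import javax.swing.JPanel;

public class MouseDemo {
    // Last position seen by the listener.
    static int lastX, lastY;

    // The adapter pattern: override only the callback you care about.
    static final MouseMotionAdapter TRACKER = new MouseMotionAdapter() {
        @Override
        public void mouseMoved(MouseEvent e) {
            lastX = e.getX();
            lastY = e.getY();
        }
    };

    public static void main(String[] args) {
        // A headless-safe component to act as the event source.
        JPanel panel = new JPanel();
        // Hand-build the event the toolkit would normally deliver after
        // pulling mouse data from the native event queue.
        MouseEvent move = new MouseEvent(panel, MouseEvent.MOUSE_MOVED,
                System.currentTimeMillis(), 0, 42, 17, 0, false);
        TRACKER.mouseMoved(move);
        System.out.println(lastX + "," + lastY); // prints 42,17
    }
}
```

In a real program you'd register the adapter with `panel.addMouseMotionListener(TRACKER)` and let the toolkit deliver the events; that's the whole lesson, and it takes ten minutes, not an hour in the JVM internals.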
Some might say that's a good thing; I know better. Sure, those were the traits that made the people in the books I read so successful: they were inquisitive, they devoured information. But my case is a bit different. To be fair to the journey, I've gained a deep understanding of computers from these nostalgic detours. For instance, what little I understand about operating system internals I owe to the fact that when you learn to write assembly programs in DOS, you come to understand its structure because you're essentially in its guts, manipulating memory. Once you encounter the terminology, you naturally begin to research what a "kernel" is, naturally discover Linux (and UNIX), download the source code to the original 0.99 version (out of nostalgia, of course), try to understand it (and fail), and end up researching "memory management" and a dozen other weird terms from the comments. So I can't possibly act like branching off into old-school stuff hasn't been beneficial; heck, it's the only reason I've made it this far. This is really just a pep talk to myself to get a little more disciplined and remember that I came to this craft as an adult, while those I've come to idolize often had exposure to computers from an early age and so were afforded time to explore. If I want to reach that dream job, I have to prioritize my time and attention properly. I have to discipline myself to put hobbies in their proper place and not use them as distractions when the main topic I'm focusing on gets difficult. I have to keep in mind that many of the most valuable programs out there have no graphical aspect at all; they're pure logic, written by someone who has done the raw study and practice necessary to construct a solution to a problem that needed solving. Until I've done that work and achieved the title of Software Engineer, the contents of video memory are off limits; for me, those pixels are poisonous.
~Posted September 6, 2018~