The RPG from Hell
Role-playing games seem to have an obsession with underground lairs full of monsters. (Dungeons & Dragons even has it in the name.) As a kid, I remember watching my brother playing Diablo II on his PC. Everything was dreary, gloomy, scary and dark. -- And that was just at the surface level! Then he descended into one of the dungeons, where the real action was! It was oppressively horrifying to my childish soul. There I sat with my eyes glazed over, my clammy little hands clenching the sides of my deeply unfashionable corduroy thrift store pants, transfixed yet wondering: Why would anyone want to go down there into those dungeons? There's nothing there! Only monsters! Why not just stay at the surface? Maybe hide, maybe get the Dark Lord's phone number and broker a deal from the safety of the village? How would that sound?
This is how I generally have felt when approaching a new piece of technology. Transfixed, but wondering how many dead-ends, unsolvable problems and wasted hours await me. I'd much rather spend time at the surface, learning about the theory, capability and higher-minded concepts. I do not, as a rule, favor getting my hands dirty.
In other words: I am not a tinkerer.
For this I have long considered myself at a serious disadvantage compared to many of my software engineering colleagues. A disadvantage only to be made up by technological brilliance. A brilliance so epic, that... I surely must falter at implementing it. This is the tragedy of my professional life, of which I have come to sing:
I don't do computers
I grew up a bookish kid, by all accounts destined for the humanities. My first and only failing grade in middle school was in a remedial computer class. I didn't play with Lego Technic. It had never occurred to me that anyone would care to take the washing machine apart, just to find out how it works. When my best friend did just that and got into hefty trouble, I was astonished and admired him greatly for it, but was under no illusion that such a path was for me.
But there was something that drew me towards computing machines, despite all the odds: Ever since I picked a losing fight with the boss kid in kindergarten and found myself no longer welcome in the sandbox, I have been interested in power: how it works, who has it, and how it may be gained. When I say "interested", of course I mean academically interested. Let's just say that a lot of my social experiences in childhood were of the, ahem, "outside-looking-in" variety, if you catch my drift. Socially shunned and ill-adept, what's a greasy-haired kid to do? -- My answer, like many a kid's, was to make my thoughts my own best friends. The world with all its physical objects was too scary and frankly just too heavy for my scrawny arms. But not in my fantasy: there I happened to be in the deadlifting business.
Enter computers: Again, computer games remained the domain of more adept individuals than me. Reflexes, practical mastery? Not my cup'a tea. Most games were also just too damn scary for my gentle soul. No, I had to stick to something less fast-paced if I was going to keep up. It was the concept of programming that intrigued me: Here was a device that could bring thoughts to life! Concepts from my mind finding embodiment -- what a heady idea! At last I would find myself in power in a domain where I actually felt at home. It didn't hurt that society was talking about tech as the way of the future. -- Furthermore, who knows -- I might even manage to talk to girls? (A popular non sequitur if there ever was one!)
So, as an angsty teen around the turn of the century -- with my remedial computing knowledge -- I set out to build an operating system for AI. I invented an XML-based "declarative" "programming"-"language" which I compiled to x86 assembly using a healthy pile of regular expressions in a sauce of Perl. Don't go looking for my Usenet posts from back then. Please. Not much came of my youthful software ambitions, though I did learn a thing or two about Perl and Java along the way.
More impactful, surprising, and in a completely unrelated chain of events, I started talking to girls, which largely involved a few brave (or severely congested) souls willing to smell past the pronounced body odor problem I was nursing at the time.
Math for Rubes
I enrolled in Cognitive Science as an undergrad, hoping to do something between Philosophy and Computer Science, without perishing of the stuffiness of either. Little did I know that I was going to discover a pearl that had long been hidden from me: The joy of Mathematics and Logic.
As a child, I had viewed Math with the same disdain, fear and shame as I would a Rubik's cube: An inscrutable jumble, promising nothing but joyless toil and failure. Rubik's cubes were not a part of my world.
But now I learned that I didn't really need to be a tinkerer to do Math. I just needed to be a committed thinker! It was a realm of thought, with logic as a worthy opponent. What a joy it is to say: What I want to do is impossible, logic does not support it. However, what I want to do is reasonable, so what can I do instead? Perhaps I can bend my requirements, change my perspective, and express what I want to formalize in such a way that logic will accept it nevertheless, if begrudgingly.
My advice for improving STEM participation: teach Math as the formally rigorous branch of the humanities that it is!
Machine Learning doesn't work yet
Next to Logic, in my studies I soaked up statistical NLP, Good Old Fashioned AI as well as Neural Nets. This was back in the day when Support Vector Machines were de rigueur. But there were some whispers around the edges about so-called "Deep Belief Networks".
The year was 2007 and I was going to build generative language models using auto-encoders. Only problem: Deep back-propagation didn't really work yet -- and you had to build it yourself, so if it didn't yield results you were never quite sure whether it was a bug or the shortcomings of your algorithm thwarting you. I, for one, had a lot of bugs.
I failed at generative natural language models, proceeded to (re-)invent adversarial networks, but failed to use the new training algorithms that were pouring out of Geoffrey Hinton's lab and instead (in a stroke of self-defeating genius) decided to roll my own Evolutionary Algorithm. None of my supervisors thought it worth publishing by the time I managed to limp across the graduation line. Frustrated, I concluded that there wasn't much future for me in academia, and picked up web development.
Engineering is an anti-pattern
Remember, I was a Cognitive Science major. While I could nail together scripts, I didn't really know how to write software! But, the myth of Silicon Valley was wafting through the air and it drew me in, much like our dog will pull at her leash if she smells a pee-stain across the street.
I was going to build an app and it would be successful. -- What if I built a unified messenger app? That's something I don't know much about!
Unfortunately, around the same time, health issues were taking over. I had been in chronic pain for a while, and with limited attention span I had skated through college and grad school mostly soaking up information osmotically from lectures and classmates. Sitting down and reading a technical book was quite the undertaking. Sitting down and writing -- nay, debugging -- half-understood code was the ultimate nasty, dark and gloomy dungeon to descend into. And yet I tried.
Debugging is a curious beast. As coders, at least once in a while we will run into situations where we don't even know how to start. Problems and bugs where we have no clue. As neophytes, we run into them all the time. What it takes to deal with those situations is incredible steadfastness, a willingness to try many things, and a breadth of experience.
People with a predisposition to start small, prod, experiment and generally muck around are at a distinct advantage. They get into fewer inscrutable dead ends, gain a breadth of experience much more quickly and generally build stuff that actually works instead of stuff that sounds like a good idea. Through my bleary eyes, staring at the editor, it dawned on me: If I wanted to make it in software, I had to learn to imitate those kinds of people: tinkerers.
The consummate tech CEO
One must imagine Sisyphus as an ancient Greek Elon Musk. Really -- have a look at the Wikipedia page: The similarity is striking. Sisyphus was a busy-body. Before his punishment, he was running around Hellas, out-innovating mortality, squeezing the Olympian shorts placed on him by Zeus, generally being hyper-active and full of himself. The craftiest of men. Vapid yet gargantuan projects such as Mars colonies, saving our wasteful car-driving lifestyle in the face of reality or rolling rocks up a hill are truly a just form of punishment for that kind of guy.
I was not that kind of guy. I was a tech wannabe. I would work on ambitious projects: Unified messaging, or a document database with indexing based on anti-unification, or a reddit clone where subreddits were replaced by embeddings into a non-Euclidean space. That kind of stuff. Unlike those Muskian achievers I didn't make it big. I didn't really make anything that people actually got to see and use. Much like Sisyphus' punishment, the stone of my attention would always slip -- before I had a chance to ship.
There's no way to sugar-coat it: I fundamentally confused writing software with yak shaving. I would push myself deeper and deeper into theory and premature optimizations, until the project just kind of became inert... sat there a while... and then I slowly, slowly, imperceptibly gave up on it.
How amazing is it that for his punishment by the Gods, Sisyphus basically gets turned into a me! Before, he's achieving, he's meeting deadlines, he's outrunning the greatest deadline of all: Death itself. Afterwards, he's striving, striving, striving... But without results.
It is the moment of failure in this myth that truly intrigues me. That instant where the rock slips out of his hand and rolls back down. Surely, he must get it, right? But no, he doesn't get it. The slip is but an afterthought, an unsolved riddle to be glossed over. The Sisyphus solution is: try harder.
I, for one, have tried trying harder. To build a successful product. And I've failed at it.
But en passant, while rolling my own rock up the hill of technological hubris, I actually did learn enough jargon to pass tech interviews.
You know how interviewers used to use fizzbuzz to weed out people who couldn't write code? Lucky for me I did know how to solve fizzbuzz. I actually knew more than that: I was able to talk intelligently enough about advanced technical topics. All those fancy technologies I had been fantasizing about. Problem was: I couldn't for the life of me get a simple software project across the finish line.
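For the uninitiated, fizzbuzz is the interview classic: print the numbers from 1 to n, but replace multiples of 3 with "Fizz", multiples of 5 with "Buzz", and multiples of both with "FizzBuzz". A minimal Python sketch (the function name is just for illustration):

```python
def fizzbuzz(n):
    """Return the fizzbuzz sequence for 1..n as a list of strings."""
    lines = []
    for i in range(1, n + 1):
        if i % 15 == 0:        # divisible by both 3 and 5
            lines.append("FizzBuzz")
        elif i % 3 == 0:
            lines.append("Fizz")
        elif i % 5 == 0:
            lines.append("Buzz")
        else:
            lines.append(str(i))
    return lines

print("\n".join(fizzbuzz(15)))
```

The trick, such as it is, lies in checking the "divisible by 15" case first; test it last and you'll never print "FizzBuzz" at all -- which is exactly the kind of slip the exercise is designed to catch.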
Indeed, I was a 0.1x programmer trapped in the body of a conversant technologist. Suffice it to say: I had trouble holding down a job. But I could pass interviews, so I could get a new one. Bit by bit things changed; I learned to stick it out. I was blessed to meet some of the kindest people in tech who endured me and my meandering incompetence. If there's any moral in this sordid tale: Talking can help and hiding can hurt.
We do not actually live in a world of mythical punishment. Simply by showing up and failing, after a while, I somehow acquired enough practical knowledge to actually do my job. In a mysterious way, many important things happen while we're not paying attention. Sometimes, all it takes is the follow-through to keep doing it. Which is really the benefit that tinkerers have: They'll just keep on playing around with something and gain experience while doing so.
My difficulty is that I will be scared and I will freeze and shrink away from taking action and gaining hands-on experience with the code beast. So now, if I'm trying to learn something expansive and complex, I often choose to first work on stuff that's related, yet non-threatening. It's a true art form to find preparatory projects that don't feel like homework. I want to be concerned with the deep and conceptual, not the fiddly work! But finding just the right on-ramp for my proclivities and capabilities can be an analytical task! And if there's one thing I can do, it's analytical tasks.
My mind will always tend towards analyzing and conceptualizing over acting and manipulating. So I have developed tricks to transform activity-heavy work into a sequence of analysis interspersed with non-threatening manipulations.
A warm and fuzzy codebase
It all came back to that fear of dungeons. Familiarity dissolves the fear. When I was watching my brother playing Diablo, he wasn't overwhelmed by fear. He may have had some spooky excitement, but mainly he was enjoying a feeling of mastery and accomplishment, on the basis of a certain sense of safety -- of knowing how the game works.
Nowadays, I sometimes can even enjoy tinkering around a little. I still find it hard to focus if I can't relate my task at hand to some larger concept. One key factor in this professional growth has been to respect my own brain and its peculiar workings. If I sit myself down with nothing but a vague notion of what it is I'm about to do -- and expect to achieve results, then I'm setting myself up for failure.
I have had to learn to:
1. Not do Engineering -- as best I can, avoid the compulsion to come up with "the right solution" and dwell on the conceptual.
2. Manage my fear of finding out how the machine actually works -- familiarity breeds contempt, and contempt is what you want when setting out to slay dragons.
3. Work with the kind of person that I am: I cannot simply sit down at a keyboard and write code freeform. I need a rough and realistic plan for what I'm going to do algorithmically.
4. Know what to punch into Google when there's a problem.
I have come to taste the joy of doing manageable tasks in a code-base that I know like the back of my hand. Sometimes my requirements still turn out too steep, concepts too abstract, implementations too tricky. It remains a struggle not to drop the rock.
But that's how I roll.