Understand Primitives

Broadly, the tech industry uses the word “primitives” to describe the base components of something, such as the 0s and 1s of machine code or the core functions of a JavaScript library. “Primitives” can also refer to the default/standard things imported as-is into a program before you add to it.

Most of the tech industry doesn’t have a natural intuition for primitives:

  1. The computer industry itself is a sprawling realm of endless specialization. Computers are a subset of electronics, which is itself a subset of electrical engineering, all broadly grouped as electrical work.
  2. Inside the computer world, hardware and software branch into their own subsets, with software development a further subset of software, programming underneath that, and data science and cybersecurity straddling alongside software development. That doesn’t even scratch the surface of the many related domains of sound engineering, web design, graphic design, CAD design, and all the other inter-related disciplines of engineering that passively mix in. All of these have distinctly different cultures.
  3. The tech industry is disproportionately loaded with people with mid/high-functioning ASD (autism spectrum disorder). One of the idiosyncrasies of ASD is an absolute obsession with (and often mastery of) certain domains, paired with a complete disregard for related domains.

These factors feed into each other. Highly gifted people with ASD tend to explore precisely what they like (e.g., app development) and disregard everything else (e.g., OS development). The culture across most tech domains reflects this.


The granular nature of tech industries creates a risk for our brains. We tend to learn far more by connecting two bits of unrelated information than by accumulating related information. Learning a little bit about each network protocol, for example, will do more for understanding networking than learning every little detail of one specific protocol. However, that broad understanding can come at the expense of granular, money-making skills in the short term.

Because it’s less productive in the short term, learning little bits of everything isn’t generally as fashionable as learning specialized sub-sub-niche information:

  1. Learning is hard, but copying is easy, and most modern educational systems (at least in the West) promote rote memorization over understanding. It’s easier to gather and regurgitate trivia than to distill it into something else.
  2. Tech experts talk a bit about “soft skills” (e.g., customer service, teamwork), but most of the business world treats workers like fixed cogs instead of adaptable memory metal. Soft skills are the career aspect of that memory metal.
  3. If someone is learning concepts and ideas, they’re not necessarily “making” something. Craftsmanship requires both hands-on skills and cerebral work.
  4. “Tribal knowledge” is the broad list of endless small details that nobody bothered to document. It’s a big part of why we can’t easily automate most labor, and why skilled professionals can often philosophically extrapolate across domains more easily than unskilled people can become skilled.
  5. Well-designed technology gives the luxury of not having to understand how things work.


As living organisms, everything we create is simply a remix of everything else. We draw from the environment and make something new with it, often more elaborate and frequently more useful.

This remixing crosses domains. Some of the most brilliant hacks come from combining, say, people’s need for a one-way trip with other people’s desire to work a side job, or mixing the need for space heating with the excess heat from mining crypto on a basement server, or the desire to run a website off a spare Raspberry Pi.


Technically, brilliant ideas aren’t as clever as they first appear. Someone merely spliced together two seemingly unrelated things in a more productive way. There’s brilliance in what they did, but not in how. They were simply educated about two unrelated things at once, and saw patterns they could exploit.

Primitives are elegant and simple. A few tech primitives, for example:

  • Machine code is purely logic-based, compounded into base-2 numbers, so everything in computers is a gradation of black-and-white true/false (with the exception of quantum computing).
  • All automation (such as programming functions) is a technical abstraction of how habits work, which means it runs the same abstracted cycle of formation and adaptation. It also eventually creates catastrophe when the anticipated triggers/inputs deviate too strongly.
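The first bullet above can be made concrete. Here’s a minimal sketch (in Python, using only built-in booleans; all function names are illustrative) showing that arithmetic is nothing more than true/false logic stacked into base-2 numbers:

```python
# Everything below is pure true/false logic; numbers only emerge
# from stacking bits. This is a "full adder" built from AND, XOR,
# and OR gates -- the same circuit inside every CPU.

def full_adder(a, b, carry_in):
    """Add three bits using only boolean operations."""
    partial = a ^ b                              # XOR: sum without carry
    total = partial ^ carry_in
    carry_out = (a & b) | (partial & carry_in)   # did we overflow this bit?
    return total, carry_out

def add_bits(x_bits, y_bits):
    """Add two equal-length little-endian lists of bits (True/False)."""
    result, carry = [], False
    for a, b in zip(x_bits, y_bits):
        s, carry = full_adder(a, b, carry)
        result.append(s)
    result.append(carry)                         # final carry becomes a new bit
    return result

def to_bits(n, width):
    """Unpack an integer into little-endian booleans."""
    return [bool((n >> i) & 1) for i in range(width)]

def from_bits(bits):
    """Pack little-endian booleans back into an integer."""
    return sum(1 << i for i, b in enumerate(bits) if b)

print(from_bits(add_bits(to_bits(5, 4), to_bits(9, 4))))  # → 14
```

Nothing in `full_adder` knows what a number is; it only shuffles true/false values, and “5 + 9” falls out anyway.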

Our time on this planet is limited, so we must be choosy about the information we gather. Some information, such as the details of every network protocol, is trivia, and much of it may be outmoded in a decade or two. However, other information (e.g., web standards, audio transfer considerations) is nearly timeless.


Tech primitives, however, are often hard to learn because understanding them requires rewiring your thinking. For example, learning base-2 math instead of base-10 is not intuitive at first, but it saves lots of time when figuring out memory allocation or IP addresses.
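As a sketch of that time-saving (the addresses and function names below are illustrative assumptions, not from any particular network): once you see a subnet mask as a base-2 number, “which network does this address belong to?” collapses into a single bitwise AND.

```python
# An IPv4 address is really just a 32-bit base-2 number; the dotted
# quads are a human convenience. A /24 mask is 24 ones then 8 zeros.

def ip_to_int(ip):
    """Pack dotted-quad notation into one 32-bit integer."""
    a, b, c, d = (int(part) for part in ip.split("."))
    return (a << 24) | (b << 16) | (c << 8) | d

def network_of(ip, prefix_len):
    """AND the address with its mask to reveal the network it sits in."""
    mask = (0xFFFFFFFF << (32 - prefix_len)) & 0xFFFFFFFF
    net = ip_to_int(ip) & mask
    return ".".join(str((net >> shift) & 0xFF) for shift in (24, 16, 8, 0))

print(network_of("192.168.5.130", 24))  # → 192.168.5.0
print(network_of("192.168.5.130", 26))  # → 192.168.5.128
```

The /26 case is where base-10 thinking stalls and base-2 thinking doesn’t: 130 is 10000010 in binary, the mask’s last octet is 11000000, and the AND leaves 10000000, i.e., 128. (Python’s standard `ipaddress` module does this for real work; the point here is only how little machinery the primitive requires.)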

Nobody wants to hear about the dull sausage-making of success as much as the glamorous get-absurdly-rich tech entrepreneur story. The myths we want to believe imply a brilliant and fashionable solution can be the resolute source of all our answers, so the hard work technology requires often gets a hard pass. Even automation can be more work than the labor itself if there aren’t enough repetitive iterations of a task to perform!

Several factors come together to make technology primitives especially hard compared to learning most other things:

  1. Computers are completely analytical machines. They only interpret logic. They’re highly consistent, but when they fail they suck at debugging themselves. They always do what they’re told, and have purposes with no concept of meaning.
  2. Humans, the users of those analytical machines, are feeling, dynamic living organisms. They’re radically inconsistent, but when they fail they’re very good at error correction, in themselves and in others. They often don’t do what they’re told, and sometimes say they did even when they obviously didn’t, in conformity to their ever-shifting purposes.
  3. When humans are exposed to computers, they tend to anthropomorphize the experience: the computer takes on an aesthetic personality based on the user’s biases, the user’s preferences reflect back through the computer and dictate how that person feels about it, and so on.

Anyone in the tech industry is torn between two worlds: the isolated experience of being around the world’s fastest and most perfect idiots, set against the interpersonal experience the rest of humanity loves to partake in.


While most things in computers churn constantly, some things don’t. If you understand those things, the endless slew of over-information that destroys our sanity won’t sweep you up with it. But this isn’t easy, so it isn’t popular.

But if you’re endlessly curious, you’ll enjoy the hard work. If you’re not, save yourself the misery and avoid it.

Working to understand the base components of computers will not pad your resume. But, it pays for itself:

  • A 10,000-foot view of what you’re working on makes learning easier.
  • You can sidestep fashions that will render themselves obsolete within 5 years and avoid wasting time.
  • A simpler view of everything permits you to see absurdly simple solutions for complicated problems, which will make you look like some sort of genius.
  • All this saved time will mean you’ll live a more complete, fulfilling life.

Of course, it’s not the rush of endless stimulation that comes standard with the tech industry, but everyone eventually gets too old for that way of life.