Broadly, the tech industry uses the word “primitives” to describe the base components something is built from, whether the 0s and 1s of machine code or the core pieces of a JavaScript library. “Primitives” can also refer to the default, standard things imported as-is into a program before you add anything of your own.
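As a minimal sketch of that baseline, here are Python’s built-in value types, which play the same “primitive” role the term describes (an illustration, not an exhaustive list):

```python
# Python's built-in value types are its primitives: the atoms that
# ship with the language before you import or define anything.
atoms = [True, 42, 3.14, "text", b"\x00\x01", None]
for value in atoms:
    print(type(value).__name__)  # bool, int, float, str, bytes, NoneType
```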
Like any hyper-specialized field, most of the tech industry doesn’t have a natural intuition for primitives:
- The computer industry itself is a sprawling realm within a broader niche. Computers are a subset of electronics, which is itself a subset of electrical engineering, which is broadly grouped under electrical work.
- Inside the computer world, both hardware and software branch into subsets, with software development a subset of software and programming beneath that. Data science and cybersecurity sit alongside software development, and the terms sometimes overlap. That doesn’t even scratch the surface of the many related domains of sound engineering, web design, graphic design, CAD, UX, and all the other interrelated engineering disciplines that passively mix in. To make it even more entertaining, each of these has a distinctly different culture that derives from its approach.
- The tech industry is disproportionately populated by people with mid- or high-functioning ASD (autism spectrum disorder). One of the idiosyncrasies of ASD is an absolute obsession with (and often mastery of) certain domains, alongside a complete disregard for related ones. These specialists frequently explore precisely what they like (e.g., app development) and disregard everything else (e.g., OS development), which makes the cultures even more dissonant.
Pigeonholed
The granular nature of tech industries creates a risk for our brains. We tend to learn much more by connecting two bits of unrelated information than by accumulating related information. Learning a little about each network protocol, for example, is more broadly useful for understanding networking than learning every little detail about one specific protocol. However, that broad understanding isn’t always part of the short-term, granular skill set that makes more money.
It’s not as productive in the short term, so learning little bits of everything isn’t generally as fashionable as learning specialized sub-sub-niche information:
- Learning is hard, but copying is easy, and most modern educational systems (at least in the West) promote rote memorization over understanding. It’s easier to gather and regurgitate trivia than to distill it into something else.
- Tech experts give some attention to “soft skills” (e.g., customer service, teamwork), but most of the business world seems to treat workers like fixed cogs instead of adaptable memory metal. Soft skills are the career aspect of that memory metal.
- If someone is learning concepts and ideas, they’re not necessarily “making” something. Craftsmanship requires both hands-on skills and cerebral work.
- “Tribal knowledge” is the broad list of endless small details that nobody will ever bother to document. It’s a big part of why we can’t easily automate most labor, and why skilled professionals can often philosophically extrapolate across domains more easily than unskilled people can become skilled.
- Well-designed technology gives the luxury of not having to understand how things work.
Remixes
Like all living organisms, we create by remixing everything else. We draw from the environment and make something new with it, often more elaborate and frequently more useful.
This carries across domains. Some of the most brilliant hacks come from combining unrelated domains:
- Combining some people’s need for a one-way trip with other people’s desire to work a side job
- Mixing the need for space heating with the extra heat from mining crypto on a basement server
- The desire to run a website off a Raspberry Pi
Elegant
In a technical sense, brilliant ideas aren’t as clever as they first appear. Someone merely spliced two seemingly unrelated things into a more productive solution. There’s brilliance in what they did, but not in how: they were simply educated about two unrelated things at once, then saw patterns they could exploit.
Primitives are elegant and simple. A few tech primitives, for example:
- Machine code is purely logic-based, compounded into base-2 numbers, so everything in computers is a composition of black-and-white true/false (quantum computing excepted).
- All automation (such as a programming function) is a technical abstraction of how habits work, which means it runs the same abstracted cycle of formation and adaptation. It also eventually creates catastrophe when the anticipated triggers or inputs deviate too strongly from the actual ones (the sketch after this list illustrates both points).
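A minimal Python sketch of both primitives above: logic gates compounding into base-2 arithmetic, and an automated routine that only works while its inputs match what it anticipated (the function names here are illustrative, not from any particular codebase):

```python
# Point one: a half adder. Two logic gates compound into base-2
# arithmetic; everything else in the machine is more of the same.
def half_adder(a: bool, b: bool) -> tuple[bool, bool]:
    total = a ^ b    # XOR gate produces the sum bit
    carry = a and b  # AND gate produces the carry bit
    return total, carry

print(half_adder(True, True))  # (False, True): 1 + 1 = 10 in base 2

# Point two: automation as a habit. The routine runs flawlessly
# until the input deviates from the trigger it anticipated.
def apply_discount(price: float) -> float:
    return price * 0.9  # silently assumes price is a number

print(apply_discount(100.0))   # 90.0 -- the habit holds
# apply_discount("100")        # TypeError -- the trigger deviated
```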
Our time on this planet is limited, so we must be choosy about the information we gather. Some information, such as the details of every network protocol, is trivia, and much of it may be outmoded in a decade or two. However, other information (e.g., web standards, audio transfer considerations) is nearly timeless.
Hard
Tech primitives, however, are often difficult to learn because understanding them requires rewiring your thinking. For example, base-2 math instead of base-10 is not typically intuitive, but it saves lots of time when figuring out memory allocation or IP addresses, as the sketch below shows.
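A minimal sketch of that payoff in Python, using the standard ipaddress module (the specific subnet is arbitrary):

```python
import ipaddress

# IP addressing is base-2 math: a /26 prefix leaves 32 - 26 = 6 host
# bits, so each subnet spans exactly 2**6 = 64 addresses.
net = ipaddress.ip_network("192.168.1.64/26")
print(net.netmask)        # 255.255.255.192
print(net.num_addresses)  # 64

# Memory allocation follows the same powers of two: 16 GiB is 2**34
# bytes, so addressing it takes exactly 34 bits.
gib_16 = 16 * 2**30
print(gib_16 == 2**34)          # True
print(gib_16.bit_length() - 1)  # 34
```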
Nobody wants to hear about the dull sausage-making of success as much as the glamorous get-absurdly-rich tech entrepreneur story. The myths we wish to believe imply a brilliant and fashionable solution can be the resolute source of all our answers, so the hard work required with technology typically receives a hard pass. If there aren’t enough repetitive tasks to perform, even automation can be more work than the labor itself.
Several factors come together to make technology primitives especially hard compared to learning most other things:
- Computers are completely analytical machines that only interpret logic. They’re highly consistent, but when they fail, they aren’t good at debugging themselves. They always do what they’re told, and they act with absolutely no meaning attached to their purposes.
- Humans, the users of those analytical machines, are feeling, dynamic living organisms. They’re radically inconsistent, but when they fail they’re exceptional at error correction, both in themselves and in others. They often don’t do what they’re told, and sometimes say they did even when they obviously didn’t, all in conformity with ever-shifting purposes anchored to a comparatively less nebulous sense of meaning.
- When humans are exposed to computers, they tend to anthropomorphize the experience. The user projects an aesthetic personality onto the computer based on their biases, and those preferences dictate how the user feels about the machine. The computer doesn’t care.
Anyone in the tech industry is torn between two worlds: the isolated experience of being around the world’s fastest and most perfect idiots, and the interpersonal experience of the rest of the world, with all its feelings and vagueness.
Universals
While most things in computing constantly change, some things don’t. If you understand those constants, the endless slew of over-information that destroys our sanity won’t sweep you up with it. But this isn’t easy, so it’s not frequently popular.
If you’re endlessly curious, though, you’ll enjoy the hard work. If you’re not, save yourself the misery and avoid it.
Working to understand the base components of computers will not pad your resume. However, it pays for itself over time:
- A 10,000-foot view of what you’re working on makes learning easier.
- You can sidestep fashions that will render themselves obsolete within 5 years, and avoid wasting time on them.
- A simpler view of everything permits you to see absurdly simple solutions to complicated problems, which will make you look like some sort of genius.
- All this saved time means you can live a more complete, fulfilling life, spent on the things you actually want to do.
Of course, it’s not the rush of endless stimulation that comes standard with the tech industry, but everyone eventually gets too old for that way of life.