Tech Trends Suck

First, I’m not saying technology itself is bad. Among other things, technology empowers us to do things that were considered miracles a few short decades ago.

The tech industry is filled with young people. While there are older statistical outliers, the industry leans toward people in their 20s, and there are a few reasons for this.

A. By their very nature, young people have a few demographic features:

  • They’re often more concerned with immediate money than with a long-term career, simply as a product of maturity, which makes them expendable.
  • They have less life experience, so they’re easier to exploit, often have a more positive attitude, and are more likely to be loyal to their company.

B. New technologies are constantly getting developed, used, and outmoded. Trends that would normally take 10-20 years in the rest of the world move in about 2-5 years inside tech, for several reasons:

  • Because of the abstraction-based nature of computer technology, most software is fire-and-forget once it runs well, so an established technology becomes the starting point for other things.
  • See A. Young people like novelty for novelty’s sake.

So, because of this, the industry runs, and always will run, on cheap, naive, ambitious labor.

Neophiliac Culture

Any field dominated by young people and validated by prior success is almost guaranteed to push hard against proven things that work. The founding of the USA is a similar story, which proved that a constitutional republic can work (at least for a few hundred years).

The history of Silicon Valley is a standard story of human nature pushing against limits. Shockley Semiconductor Laboratory was created in 1955 by William Shockley. Only two years later, the eight leading scientists there (dubbed the “traitorous eight”) left to create Fairchild Semiconductor (a division of Fairchild Camera and Instrument) in Mountain View, California. Their efforts, combined with Bell Labs and a few other large players in the industry, created the Silicon Valley culture that still permeates tech across the world.

Since Silicon Valley’s founding, novelty has reigned over computer technology, mostly because of how exponentially scalable new technology becomes. Between 1985 and 2000, for example, a standard personal computer became orders of magnitude faster and could support richer features like a nicer GUI and better games.

Shortly after computers were developed, the philosophy around them took a distinctly post-modern angle: human-computer synthesis would create an altogether new way of approaching life. Those aspirations were partly correct and partly grandiose. The ideas can get lofty, and tend to borrow heavily from science fiction.

  • J. C. R. Licklider’s 1960 paper “Man-Computer Symbiosis” implies that people will universally think and behave more intelligently because of computers. That hasn’t happened, and we now have an over-abundance of information (and must learn to fight it).
  • People anticipated robots and AI would take over all the jobs. That hasn’t happened yet, and probably never will.
  • Some people speculated that the internet would remove ignorance and hatred. That also hasn’t happened, and it’s possible the internet has made them worse.

In the face of most criticism, their attitude is mostly “we haven’t gotten there yet, but we will soon!”, and most of them are die-hard advocates of what I call the Idiot Ancestor Theory (i.e., the people who came before us were morons).

This new-is-better philosophy hasn’t really changed, either. In 2013, Mark Zuckerberg was famously quoted as saying the motto of Facebook was “move fast and break things”. While he walked it back a year later, the attitude still permeates the culture.

One of the dominant reasons for this is the natural design of the computer versus how nature works:

  • Nature is inherently messy, with obscure redundancies everywhere, hidden features, additional components, and endless permutations.
  • Computers, by design, are built on logic: ordered, typically well-organized, clean, conspicuous when problems arise, and resistant to arbitrary change.
  • Most computer-based work must fight back against the randomness of nature with things like error-correcting codes, and only specific implementations of nature’s chaos have any advantageous use (e.g., AI).
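Error correction is a concrete example of computers fighting back against nature’s randomness. Here’s a minimal sketch of an even-parity check in Python (purely for illustration; real systems use far stronger codes such as Hamming or Reed-Solomon):

```python
def add_parity(bits):
    """Append an even-parity bit so the total count of 1s is even."""
    return bits + [sum(bits) % 2]

def check_parity(bits):
    """A word is valid iff its 1s (data + parity) total an even number."""
    return sum(bits) % 2 == 0

word = add_parity([1, 0, 1, 1])   # -> [1, 0, 1, 1, 1]
assert check_parity(word)         # clean word passes
word[2] ^= 1                      # nature flips one bit in transit
assert not check_parity(word)     # the corruption is detected
```

A single parity bit only detects one flipped bit; production memory and storage use codes that can locate and repair errors as well.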


The downside of all the neophilia is that proven practices and sturdy, reliable systems can often be overlooked:

  • COBOL is a fast, battle-tested language, though it has downsides that make it unwieldy for most modern uses. Even though it was made in 1959, commonly cited figures as of 2020 put around 80% of financial transactions on COBOL systems.
  • RSS is a reliable, decentralized, free protocol for sending intermittently updated feeds of public information. It’s also not on the radar of many tech people because it was released in 1999.
  • There have been efforts to make every interface GUI-based, and even VR-based, but nothing will ever fully replace the simplicity and straightforwardness of plain text and typed commands.
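The reliability of RSS comes partly from how little there is to it: a feed is plain XML that any standard library can parse. The sketch below uses Python’s stdlib and a hypothetical minimal feed (in practice you’d fetch the XML over HTTP):

```python
import xml.etree.ElementTree as ET

# A hypothetical, minimal RSS 2.0 feed for illustration.
FEED = """<?xml version="1.0"?>
<rss version="2.0">
  <channel>
    <title>Example Blog</title>
    <item><title>First post</title><link>https://example.com/1</link></item>
    <item><title>Second post</title><link>https://example.com/2</link></item>
  </channel>
</rss>"""

root = ET.fromstring(FEED)
# Each <item> is one entry in the feed; a title and link are all a reader needs.
items = [(item.findtext("title"), item.findtext("link"))
         for item in root.iter("item")]
print(items)  # [('First post', 'https://example.com/1'), ('Second post', 'https://example.com/2')]
```

That’s the whole protocol from a consumer’s perspective, which is why feed readers from 1999 still work today.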

Since the newest things are untested, highly influential speakers can grab the industry’s attention even when they have no credibility to back up what they’re saying, or when what they’re saying is outright impossible. If it doesn’t draw someone in with hope, it’ll draw them in with fear, since evidence would take too long to acquire.

Some of these influencers are literally on drugs. From Atari’s famous games to the executives running today’s tech companies, the inspiration for the Next Big Thing often comes through copious amounts of substances that disconnect our ability to think practically. Most people don’t know about it because there’s very little incentive for anyone to talk about it, for a variety of reasons.

This “newer is better” attitude also misses a key detail: non-tech people are naturally more resistant to change:

  • People were very quick to adopt the keyboard and mouse. While the touchscreen is a logical step forward, people will keep using the keyboard and mouse long after I write this in the 2020s, and a VR revolution won’t get rid of the flat display screen for a long time after that.
  • People still use printers, and most tech people imagined they’d be gone by now.
  • “Smart” cars are unwieldy and awkward to use. One of the most egregious UX failures is replacing physical control knobs with buttons on a touchscreen running a proprietary, stripped-down, garbage-quality OS, though there’s often a way to downgrade your vehicle to include the knobs again.

This neophilia isn’t new to computers, either. Since at least the 1920s, people have wrongly forecast that humanity would be rendered obsolete by the rise of horseless carriages, mass production, push-button panels, and robots.

It’s perfectly reasonable to assume that AI and self-driving cars will yield results similar to the grand trends of yesteryear, and that the future of DNA programming will be much the same.

Long-Term Downsides

A newest-is-best attitude carries constant long-term costs. Very frequently, developers are reinventing the wheel. It’s not uncommon to see this pattern in descriptions of new technologies on GitHub and Product Hunt:

  • “[Older Technology], but faster.”
  • “Like [Older Technology], but has [Newer Technology Feature] in it.”
  • “A simpler solution for [Problem With An Existing Known-Good Solution].”

Those technologies often are better, but their developers perform tons of rework without considering the best use of their time. Most of their motivation is to become the Next Big Thing, but with another decade of life experience they’d see how statistically unlikely that endeavor is and plan accordingly.

The age-old axioms of “slow is smooth, smooth is fast” and Chesterton’s Fence (don’t remove things without understanding why they were there) have limited cultural relevance in the tech world. As a result, the churn of potentially useless information in tech blogs and tech guides rivals the stock market’s, and things frequently break because someone didn’t think it was worth testing extensively before shipping.

The irony of this is that many technologies do have utility, but most often long after everyone has stopped talking about them. Tape drives, for example, are still a perfectly valid way to store large amounts of infrequently accessed information (e.g., farm data).

This also means people will spend exorbitantly on the latest/greatest/newest/shiniest products, which have a shelf life of roughly 6-12 months before the next purchase. That may be worth the money for people inside the industry, but for everyone else it’s an absurdly expensive hobby.


Obsession with novelty, along with perpetually working intimately with computers, can distort a person’s view of reality.

Computers are a unique world unto themselves:

  1. Updating a computer simply means running pre-made software the developer has already tested; the code is effectively identical to what the developer ran, and the process is frequently very simple or invisible to the user. Most updates are presumably good.
  2. A GUI can look obsessively neat and tidy, so everything can satisfy the obsessive preferences of a computer user. If you don’t like the color of something, that’s often a setting or line of code away from changing, and it’s a rewarding experience to explore it.
  3. Everything in a computer is logic-based. If something breaks, there’s always a logical reason for it, and the code/hardware has a predictable answer if you look hard enough. Reading documentation is geeky and technical, but effective.
  4. Every aspect of a computer is clean-cut. Language is clearly articulated, computerized physics are simplified reproductions of reality, and distortions of perception are overlaid on top of the absolute information the computer already understands.
  5. Everything that’s “default” can be changed with the right programming.

By contrast, reality is messy:

  1. Updating something isn’t always easy. The very act of updating is a violation of previously formed habits, and the changes are often as destructive as they are helpful. You often can’t verify that an update comes from a trustworthy source.
  2. Obsessively organizing and managing life is almost more trouble than it’s worth. There are always sporks, and it takes lots of time to establish and maintain an organization system. Even then, you might not have room or resources to keep everything pristinely categorized.
  3. Everything in life is perception-based. When things break, that’s often only a matter of perception. Even the atomized form of reality is bound up in uncertainty, and the primitives of perception itself are bound together with sentiment. We have no manual beyond whatever religion we use, and the various types of documentation frequently contradict each other.
  4. The physics and sociology that tie to absurdly mundane things (such as boiling water or having a conversation about the weather) are vastly more complicated than most people realize, so predicting precisely is far harder than it sounds.
  5. Some things are programmed automatically, and can’t be redefined. Death and taxes, for example.

This can create remarkable delusions when tech-minded people try expanding their worldview into the space beyond their computers, especially since Silicon Valley is heavily subject to the Cupertino Effect.

One clear consequence of all this is that most tech people are politically progressive.

  1. Naturally, if the old-fashioned way of things is inherently inferior, there’s no reason that a simple modern solution can’t fix what everyone’s been complicating for thousands of years.
  2. And, more importantly, that would mean any new political solution we haven’t tried yet can work to fix humanity, as demonstrated by the models.

Most of them miss the fact that nothing whatsoever under the sun can technically fix the human condition. People will use technology to make life easier in a general sense, but some will use that same technology to cheat, break laws, violate other people’s boundaries, and kill. Alfred Nobel believed dynamite would end all war, and similar claims were made about atomic weapons; the view that new technology will bring in new morality is just as misguided.

Most prominently, the fields of AI and VR attract the most delusional sets of expectations. AI is an attempt to create life, and VR is an attempt to create creation itself. A well-trained human-like machine-learning algorithm will have all the defects of humanity, and a complete virtual world will have all the defects of the world we presently live in.


Since the industry works with information, its gatekeepers are the most powerful information brokers on the planet. Information is the key to understanding anything, so the manager of information technology is the de facto gatekeeper of understanding. Every CEO of a large tech company has more knowledge-power than the collective entirety of Ancient Rome, wielded far more efficiently, without the breakdowns in information transfer that existed before messages were sent electrically.

In practice, only a few classes of individuals and groups hold all the power:

  1. Corporate executives who approve gigantic and revolutionary projects, with the intent to make lots of money through being the pioneer of an industry.
  2. Large corporations who capitalize on established, reliable technologies, with the intent to make lots of money through mass-produced distribution of those technologies.
  3. Small, individual developers who are lucky enough to become #1 or #2, whose political agendas and management styles change as they ascend to power.
  4. Independent open-source developers who are typically too geeky to climb to social power, and typically value complete software freedom.

As long as young people obsess about trends, they’ll likely never see those power dynamics at play, and the cycle of power changes will repeat endlessly.

The only redemption of tech is that the trends move so fast that no single corporation can theoretically corner the market, since its technology becomes outdated as soon as it blinks. It also helps that the younger generation applies enough pressure for open-source code, meaning those large entities must constantly shed at least some of their power to the masses or watch all the smart kids go somewhere else.

Like any popular form of power, the technologies of today will probably grow until people feel threatened by them; then other forms of power (like governments) will subdue and regulate them, and everyone will move their attention to yet another power-granting technology, as we see with the internet and AI right now.


Typically, an open-source implementation arises once a company fails spectacularly at providing all the features and conveniences it pioneered. Some people in the industry want a free, open society and have severe trust issues with the powerful movers and shakers behind the technology. They range from libertarians to communists, but share a hatred of centralized control under the organizations presently running those systems.

These vigilante-style programmers tend to find solace in passion projects like open-source OS development (e.g., GNU/Linux) that work directly against the interests of FAANG. They tend to simply build a free version of what already exists, but they continually give power to anyone in the public willing to read the documentation.

Their innovation is often a response to power plays (e.g., building a video-hosting alternative when the primary hosting service becomes Orwellian), so FLOSS tends to follow after FAANG has paved the way for profit. Their work is therefore rarely on the bleeding edge of the trends, often coming months afterward. Also, because of the complexities of intellectual property, companies sometimes screw up and accidentally release something open-source they didn’t expect would be so popular!

Many in the open-source community imagine closed-source will be overtaken by open-source (e.g., Facebook made React), but that reasoning doesn’t match reality. People like to own things they can profit from, and companies still find a kind of profit in open-source through free marketing and free debugging.

After all, young people are willing to volunteer for a cause they believe in, even when it’s silly.

Further Reading

Awesome Falsehoods Programmers Believe in