Leveling up: why developers need to be able to identify technologies with staying power (and how to do it)

“JavaScript fatigue” has become a common phrase among today’s front-end developers. It can seem like there’s a new hyped framework, architecture, command-line tool, or SaaS developer service every day. The constant churn of new things can leave developers more jaded than excited.

To avoid this, it’s important to build up a solid instinct for separating the technologies and products worth spending time on from the ones that will fade into obscurity after their 15 minutes of fame are over, their featured article on TechCrunch has faded into the archives, or the last passive-aggressive comment on their “Show HN” thread is long forgotten.

My journey as a programmer started almost 30 years ago when I got my first computer: a used Commodore 64 that would greet me with a blinking cursor as my entry point into BASIC V2.

Since then, the only constant in the world of development has been change, and the need to always be learning and discovering. Here are some thoughts on how I’ve been able to keep up along the way without drowning in the constant flow of novelties.

Learn your history
This might be a surprising bit of advice in an article about getting ahead of the pace of change, but to understand and evaluate contemporary technologies, you have to learn about the history of your field.

In a space that changes this much and this often, it’s easy to take for granted that the stream of releases is truly new. But technology tends to be surprisingly cyclical; what might seem new on the surface tends to have deep historical roots below.

When Ruby on Rails came out in 2004, it had an almost meteoric rise and an immense influence on the industry. At the same time, most of the ideas underlying the Model View Controller (MVC) pattern it was based on, as well as Ruby’s foundational object-orientation patterns, went all the way back to the Smalltalk programming environment from the late 1970s.

For developers who were fluent in the major web platforms of the time (PHP, Java, ASP), Ruby on Rails introduced not just a whole new language with a new syntax, but new concepts and a major new paradigm for metaprogramming. However, for developers who had followed the rise (and fall) of Smalltalk and the languages and platforms inspired by it, Ruby on Rails was full of familiar concepts (with a bit of new syntax and some adaptation from the world of Smalltalk applications onto the web). All they needed to learn were the (important, but not huge) differences between Ruby and Smalltalk, and the conceptual differences between MVC for the web and MVC for a Smalltalk application.

In a similar way, when React came out, it seemed to instantly sweep aside a whole generation of JavaScript frameworks. Most of these had tried to transfer a Rails-inspired MVC model to the browser. To many developers, it seemed a drastic departure both from the single-page app frameworks relying on templates with two-way data binding and from simpler libraries like jQuery. But at its core, React was inspired by ideas from functional programming languages (especially OCaml) that went all the way back to the early days of computing.
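
To make that lineage concrete, here is a minimal sketch (plain JavaScript, no framework, with made-up data) of the core idea React borrowed from functional programming: treat the view as a pure function of the data, and re-render by calling that function again with new data instead of mutating the DOM piece by piece.

    // The view is a pure function: same data in, same markup out.
    function profileView(user) {
      return `<h1>${user.name}</h1><p>${user.bio}</p>`;
    }

    // Rendering means calling the function again with new data.
    // React adds a virtual DOM diff so only the changed parts touch
    // the real DOM, but the mental model is the same.
    // (Assumes a page with an element whose id is "app".)
    function render(user) {
      document.getElementById('app').innerHTML = profileView(user);
    }

    render({ name: 'Ada', bio: 'Analytical engines' });
    render({ name: 'Ada', bio: 'Analytical engines and Bernoulli numbers' });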

The creator of React, Jordan Walke, has described how his own journey back through history gave him the background he needed to build out React.

For many front-end developers, the journey into full-on state management in React with some form of “Flux” architecture like Redux, maybe combined with Immutable.js, can feel overwhelming. But for developers with a solid historical foundation who had been following the re-emergence of functional programming — and the concepts around it going back to the creation of LISP in 1958 — React reflects familiar concepts and ideas.
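
One reason it feels familiar: at its core, a Redux-style store is just a pure function folding a stream of actions into a new state, the same reduce/fold idea functional languages have carried since the LISP era. Here is a minimal sketch (all names are made up for illustration):

    // A reducer is a pure function: it never mutates the old state,
    // it returns a new one built from the old state and an action.
    function cartReducer(state = { items: [] }, action) {
      switch (action.type) {
        case 'ADD_ITEM':
          return { ...state, items: [...state.items, action.item] };
        case 'CLEAR':
          return { items: [] };
        default:
          return state;
      }
    }

    // Replaying a list of actions is literally a reduce/fold --
    // conceptually the same fold you'd write in LISP or OCaml.
    const actions = [
      { type: 'ADD_ITEM', item: 'book' },
      { type: 'ADD_ITEM', item: 'pen' },
    ];
    const finalState = actions.reduce(cartReducer, undefined);
    console.log(finalState); // { items: ['book', 'pen'] }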

Even when actively trying to learn a new technology, history can be a helpful teacher. When Rails was first released, it was tough to come by material about it aside from a few online docs, tutorials, and the source code itself (more about source code later). However, a lot had been written about the evolution of MVC from Smalltalk to Objective-C, and about the lessons learned from metaprogramming and message-passing OOP in the Smalltalk world.

This can be a great tool for learning new technologies much faster: instead of reading only the latest tutorials and the emerging documentation, figure out what they’re inspired by and what previous knowledge they draw on and build upon. The material about those older technologies, ideas, and methodologies will most likely be much more mature, and you’ll find plenty of lessons learned that still apply to the new take on the field.

A solid historical awareness gives you a really good toolset to ask the question: what is different this time? The answer (or lack of one!) to that question will very often determine the success or failure of a new technology.

People, culture, and community matter
It’s easy to think that tools and technologies simply evolve on their own: object-oriented programming gave way to functional programming, text editors grew into full-fledged IDEs, and dynamic languages were displaced by statically typed ones. However, new technologies and frameworks don’t just follow an evolutionary path on their own. They’re invented, built, and disseminated by humans, organizations, and communities.

When a new tool or technology emerges, it’s important to question both the technical underpinnings (How is it different? What underlying patterns does it build on?) and motivation (Why did someone choose to build this now? Who are the people that feel passionate about this? What problems does this technology solve for organizations?).

One of my favorite essays on why some tools win while others fade away is Richard P. Gabriel’s “The Rise of Worse is Better” from 1989. It describes a possible reason why Unix and C overtook the LISP-based technologies — a reason that had nothing to do with the inherent qualities of the two solutions.

In the essay, Gabriel describes a “worse-is-better” approach (the New Jersey school of design, in contrast to the MIT/Stanford school) that values simplicity of implementation over simplicity or correctness of the end-user interface. This focus allowed C and Unix to beat LISP in the market: C compilers were easier to implement, port, and optimize than LISP compilers, which made it much faster for the Unix implementers to get software into the hands of users. That led to faster adoption and eventually meant that far more people (and companies) were invested in growing and improving the C/Unix ecosystem.

When evaluating new technologies, understand not just what they aim to do and how they are technically implemented, but also how they are going to spread and how they will grow a community. Often the technologies that become important to the mainstream programming community are those with the best answers to those latter questions, even when they can seem like a step back on purely technical grounds.

But here’s the real trick: sometimes tools that are technologically way ahead of the curve are doomed never to get widespread adoption (I’m willing to bet a lot of marbles that we won’t all be writing web apps in the Idris language anytime soon). LISP never became mainstream, but so many of today’s mainstream frameworks, languages, libraries, and techniques owe a huge debt to the ideas it invented and explored, and even today learning LISP can bring lots of insight into future technologies.

If you can spot the tools that live in this intersection, then learning those might bring you your next developer super-power.

Always understand the “why”
Back when I started developing, the closest thing to Stack Overflow was computer magazines with source code listings you could type in by hand to get the programs running.

I’m a sloppy typist, and I could never manage to type in a complete program without errors along the way. That is actually one of the (admittedly very few!) advantages of printed program listings over copy-and-pasteable Stack Overflow snippets: to get the program to work, you need to actually understand the code.

As developers, we’re always working against looming deadlines and under pressure to get new functionality, features, and bug fixes into the hands of our users as fast as possible. I’ve seen developers get so focused on getting something out there that they throw libraries and code snippets together without taking the time to understand why it works. Or they’ll see that something is broken and simply try different potential solutions without first taking the time to understand why the system broke in the first place.

Don’t be that developer. Make it a rule for yourself to never use a solution from Stack Overflow or elsewhere before you take the time to understand why that solution could work. Challenge yourself to go one step further and figure out what it would have taken for you to come up with that solution yourself.

Sometimes you’ll find an issue where a small change (swapping one library for another, calling a variation of the function you were using, etc.) fixes a bug, but you don’t actually know why. Don’t settle at this point. Dig in and build up a mental model that lets you understand exactly why one solution failed and another worked. Very often this will lead to deeper insights, and you’ll discover patterns that might reveal undetected bugs lurking in other parts of your system.

Approach new technologies in this way too. Don’t stop at surface-level learning. Learning the syntax of a few different frameworks or languages won’t teach you much, but learning the decision-making process below the surface of those technologies will fundamentally make you a better developer.

When all is said and done, the most important thing is not what you learn (which framework, which tool, which language), but what you learn from it.

Putting these lessons to work

Choosing the right tools isn’t always easy or obvious — even for the most prolific of programmers. There’s a constant trade-off between sticking to well-known, trusted, and reliable tools with few surprises and adopting brand-new technologies that can help solve problems in new and better ways. But a little up-front work can make successfully choosing and implementing new tools part of your development practice. Indeed, it is a practice, one that’s always evolving. Here are a few ways to apply the suggestions from this post.

Learn your history
Historical awareness provides a solid toolset for asking, “What is different this time?” The answer (or lack of one) often determines the success or failure of a new technology. New stuff is cool. New stuff is fun. But if you feel overwhelmed by the speed of it all and the occasional burst of JavaScript fatigue is kicking in, slow down and remember that it’s a long game, and that following the large trends matters more than constantly rushing to rewrite all your apps in the newest framework. Peter Norvig puts it well in his essay “Teach Yourself Programming in Ten Years”.

People, culture and community matter
Thanks to the meteoric rise of GitHub, Stack Overflow, and npm, it’s a lot easier to get early insight into how a community will scale and how developers are responding to a project’s ambitions. While contributors and stars can tell you a lot about projects that are already successful, they aren’t always great early indicators of success. However, when judging whether a project is likely to be embraced by the community, you can apply the same logic you already use to build your own software or to choose which company you want to work for.
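
As one small, concrete starting point, a few of those community signals are easy to pull straight from GitHub’s public REST API before you commit to a dependency. A minimal sketch follows; which thresholds actually matter is your own judgment call.

    // Fetch a few rough community-health signals for a repository.
    // Uses GitHub's public REST endpoint; unauthenticated requests are
    // rate-limited, so treat this as a quick spot check, not monitoring.
    async function communitySignals(owner, repo) {
      const res = await fetch(`https://api.github.com/repos/${owner}/${repo}`);
      if (!res.ok) throw new Error(`GitHub API returned ${res.status}`);
      const data = await res.json();
      return {
        stars: data.stargazers_count,
        openIssues: data.open_issues_count, // includes open pull requests
        lastPush: data.pushed_at,           // a rough proxy for recent activity
        license: data.license && data.license.spdx_id,
      };
    }

    // Example: communitySignals('facebook', 'react').then(console.log);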

Always understand the “why” behind a technology
Don’t focus on the surface, but on the currents underneath. Learning the syntax for a few different frameworks or languages will get you by, but learning the decision-making process of those technologies will fundamentally make you a better developer.

Michael Feathers has a great list, “10 Papers Every Developer Should Read”. All of them cover foundational ideas about languages, architectures, and culture, and together they set a solid baseline for understanding the ideas beneath so many of the trends still making waves in programming today.

Go forth and dive into all the new things! But do it at a pace that makes sense, a pace that gives you time to build the right kind of foundation. That foundation will eventually let you adopt new technologies faster, understand them more deeply, and evaluate their staying power more thoroughly.

