I built my first website in 1995. It was a simple affair consisting of markup I wrote with notepad.exe. Deployment involved FTPing the files to GeoCities. The code is long gone, but that site would still render the same with Chrome in 2018 as it did with Netscape in 1995. Heck, even the first website ever still looks fine in a modern browser. (And it’s fast!)
This level of backwards compatibility doesn’t happen by accident. It’s not fun or exciting to carry along decades of technical debt. The way we build websites today is nothing like the way we built them 15 years ago, but fortunately the old stuff still works. I believe this is a large factor in the web’s success. People don’t give enough credit to backwards compatibility and the benefits it unlocks.
Of Protocols and APIs
One of the best examples of backwards compatibility is Windows. The Win32 API is an archaeological look at software history. Microsoft famously added special-case logic so SimCity would work on Windows 95, even though it read memory after freeing it. And here’s a video of someone upgrading all the way from Windows 1.0 to Windows 10. That takes serious dedication.
But why is this important? For Windows at least, developers felt confident that the code they wrote would continue to work for the foreseeable future. This gave Windows a deep bench of useful software. Additionally, older-but-still-functional software could be deployed on new versions of Windows without much thought. There’s still a compatibility mode baked into Windows 10 for very old code.
The web follows a similar path. HTTP, the foundational protocol browsers use to communicate with web servers, has been developed in a backwards-compatible way. I can still visit websites served with HTTP/1.0 and see the content. Further, browsers can still render old versions of HTML. This gives the web a deep bench of old-but-still-valuable content. As a web developer, I can generally feel confident that the code I write today will continue to work tomorrow… or can I?
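To make that concrete, here’s roughly what an HTTP/1.0 exchange looks like on the wire (the hostname is illustrative). A request in this 1996-era shape still gets a normal answer from modern servers, which reply with their own version number while remaining compatible:

```
GET / HTTP/1.0
Host: example.com

HTTP/1.1 200 OK
Content-Type: text/html

<html>...</html>
```

You can try this yourself with `curl --http1.0 http://example.com/`.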
A Tale of Two Frameworks
When you run a software business, you’re more careful about the technology you choose because it’s your money on fire. Personally, I optimize for two things.
1. Time to market. When a feature is being built, I want it out the door and in customers’ hands as soon as possible. This means the technology employed must be easy to use and productive. I don’t want to deal with framework API churn or tooling issues.
2. Maintainability. I don’t have the time or desire to constantly care for and feed the features I build. Once they’re in the wild, I want them to run with minimal fuss, and be extensible and flexible when requirements change.
jQuery fits the bill. I can pick up code written three years ago and know exactly what it’s doing and how to extend it. People often complain of “jQuery soup” and how hard big jQuery codebases are to maintain, but that isn’t the whole story. Any large codebase will be unmaintainable if you code without discipline or structure. I’ve seen 6-month-old React apps that are more impenetrable (thanks, higher-order components) than the worst offending jQuery sites.
The Angular Treadmill of Despair
Another popular framework is Angular. Which Angular am I talking about? Angular 1 or 2? It’s confusing, because even though the frameworks are both built by Google, they have little in common. Angular 1 is called AngularJS these days, and Angular 2+ is just Angular. They don’t call it Angular 2 anymore because Angular 2 is deprecated. The current stable release of Angular at the time of this writing is Angular 6.0.5. The original release of Angular 2.0 was September 2016. So in 21 months Angular has gone up 4 major versions (yeah, I know they skipped 3.0, but that almost makes it worse). Per the Angular versioning documentation, each major version is expected to introduce new APIs and break existing ones. A new major version release (and the breaking changes that come with it) is expected every 6 months.
To make things more complicated, there’s no easy upgrade path from AngularJS to Angular. There’s no backwards compatibility between the two, and now every 6 months they could break it again. In fact, if you were using the HttpModule to make HTTP requests in Angular 2-4, you’d better switch to the HttpClientModule if you want to upgrade past Angular 5. The unit testing story is similar: SystemJS is no longer a supported way to load and run unit tests; only the Angular CLI is officially supported. This is problematic for any hybrid AngularJS/Angular application, as the CLI does not play well in that case.
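As a sketch of what that particular breaking change looks like in application code (the `User` interface and URL are hypothetical, and the “before” code assumes the rxjs operator-patching style common in Angular 2-4):

```typescript
import { Http } from "@angular/http";              // Angular 2-4, deprecated in 5
import { HttpClient } from "@angular/common/http"; // Angular 4.3+
import { Observable } from "rxjs/Observable";
import "rxjs/add/operator/map";

interface User { id: number; name: string; }

// Before: Http hands you a raw Response that you parse yourself.
function getUsersOld(http: Http): Observable<User[]> {
  return http.get("/api/users").map(res => res.json() as User[]);
}

// After: HttpClient parses JSON for you and supports typed responses.
function getUsersNew(http: HttpClient): Observable<User[]> {
  return http.get<User[]>("/api/users");
}
```

A mechanical-looking change, but every call site, test, and interceptor has to move with it.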
Developer Fatigue and Money on Fire
My main point is that modern web frameworks aren’t taking backwards compatibility seriously. Heck, most packages on NPM are the same way. Breaking changes are constant. Framework and library authors sacrifice backwards compatibility on the altar of semantic versioning, saying, “hey, we bumped the major version!” as if that’s all that matters. Developers get tired of fighting upgrade battles every few months. It also costs money. Every hour spent fighting a breaking change is an hour your software isn’t moving forward. Frameworks and libraries should help developers do their jobs, not cause friction and churn.
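To be concrete about what semver actually promises, here is a minimal sketch of the versioning rule (illustrative only, not any framework’s actual code):

```typescript
// Under semantic versioning, only a major-version bump is allowed to
// introduce breaking changes; minor and patch bumps must be safe.
function major(version: string): number {
  return Number(version.split(".")[0]);
}

// An upgrade can break you, per the semver contract, only when the
// major version increases.
function isBreakingUpgrade(from: string, to: string): boolean {
  return major(to) > major(from);
}

console.log(isBreakingUpgrade("2.4.10", "6.0.5")); // true: Angular 2 -> 6 may break you
console.log(isBreakingUpgrade("6.0.0", "6.0.5"));  // false: a patch release must not
```

The point isn’t the arithmetic. It’s that a framework shipping a new major version every 6 months is, by its own versioning contract, reserving the right to break you every 6 months.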
At TrackJS we take backwards compatibility seriously. We’ve still got customers using our v1 script agent, sending data to our v1 endpoint, which was developed almost 6 years ago. We have no plans to force those customers to move; they are still supported. If they wanted to upgrade, it’s a drop-in replacement with the new script agent. It’s fully backwards compatible. That doesn’t mean we haven’t iterated or improved things in the meantime; we just recognize the cost of supporting our older code as part of doing business.