Illustration courtesy Matt McLaughlin.
It is 9:18 AM on August 21, 2021. You have just finished eating your space-breakfast, and you’re ready to get back to work maintaining the web presence for Omni Consumer Products. After about an hour, you find your latest change fails an acceptance test. It turns out to be a bug in “RedactSelect”, an open source “multiselect” web component you’ve been using. Looks like it hasn’t been updated in years, owing largely to its maturity and stability. “No problem,” you think, “I’ll just fix the bug and fork it.”
…except the source code looks a little strange. It’s using the @ symbol in a way you’re not familiar with–almost as if it denoted a private field. That can’t be, though, because standard private fields are denoted with a #. You check the project’s build process, and sure enough, it is built using a long-outdated “transpiler.”
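For illustration, here is a minimal sketch of the mismatch (the component’s internals are hypothetical): the commented-out class uses the kind of early experimental syntax an outdated transpiler might have accepted, while the second uses the private field syntax that was ultimately standardized.

```js
// Hypothetical early-stage syntax, roughly as an outdated transpiler
// might have accepted it (not valid standard JavaScript, so it is
// shown commented out):
//
//   class RedactSelect {
//     @selection;
//     select(option) { this.@selection.push(option); }
//   }

// The standardized syntax uses "#" to denote a private field:
class RedactSelect {
  #selection = [];

  select(option) {
    this.#selection.push(option);
  }

  get selected() {
    return [...this.#selection];
  }
}
```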
You spend the next few hours making advanced web queries like, “babel 6.17.0 private field syntax”. Just as you’re getting a handle on those semantics, you stumble across a bug report for the transpiler: at that version, it was outputting subtly buggy code under certain conditions. Unfortunately, the fix didn’t arrive until the next major release, which happens to be incompatible with “RedactSelect.”
It’s now 7:03 PM, and you still haven’t fed your robot dog or taken out the cyber garbage (much less completed the feature you were working on). You decide to cut your losses and just find a new component in the morning. The next few months are punctuated by bug reports for integration issues with the replacement.
Inspired by the Future
Now, you may be expecting me to request that you get off of my lawn. While I admit to being more conservative than some of my peers when it comes to new language features, I am by no means a curmudgeon! I’ve worked hard to extend Test262 (the official test suite for the ECMAScript standard) with the latest features, and I’ve collaborated with members of TC-39 (the standards body that shapes the JavaScript language) on the design of still more. I think the committee’s new yearly release schedule and tiered acceptance process are amazing improvements over its historic approach to publishing.
As a proposal matures, it passes through a number of “stages”, each designed to help spec authors, platform implementors, and application developers collaborate. Here are the expectations for a given proposal as it advances:
- Stage 1: The committee expects to devote time to examining the problem space, solutions, and cross-cutting concerns.
- Stage 2: The committee expects the feature to be developed and eventually included in the standard.
- Stage 3: The solution is complete, and no further work is possible without implementation experience, significant usage, and external feedback.
- Stage 4: The addition will be included in the soonest practical standard revision.
My concern is that, as an industry, we have not internalized the distinctions between these stages.
The Babel project has made it incredibly easy to write code using experimental JavaScript features. All it takes is the installation of two modules, and you can be writing do expressions right alongside standard syntax like switch statements and for..of loops.
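To make that contrast concrete, here is a minimal sketch: the do expression below is an early-stage proposal and only parses with the appropriate plugin enabled, while the surrounding constructs are fully standard.

```js
// Standard syntax: for..of loops and switch statements parse in any
// modern engine, no tooling required.
for (const vowel of ['a', 'e', 'i']) {
  switch (vowel) {
    case 'a':
      console.log('first vowel');
      break;
    default:
      console.log('another vowel');
  }
}

// Experimental syntax: a `do` expression yields the completion value
// of its block. This is not standard JavaScript; it requires a
// dedicated transpiler plugin.
const greeting = do {
  if (new Date().getHours() < 12) { 'good morning'; }
  else { 'good afternoon'; }
};
```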
This amazingly low barrier has prompted many developers to adopt early-stage features across the board–from their one-off experiments, to their open source libraries, to the applications that drive their businesses.
So while the committee may make recommendations about when and how to use new constructs, for many developers, the only question is, “Is a plugin available on npm?” I’m reminded of a recent JavaScript users group meeting here in Boston. The presenter asked, “Does anyone know which features were introduced in ES2016?”
“Function decorators,” came a reply from the audience.
“Actually, that’s not part of ES2016. Even its inclusion in ES2017 is debatable.”
“Oh, 2016. That introduced destructuring assignment.”
“Not quite–destructuring binding was standardized in 2015.”
You might think I’m being a little academic here. Maybe it seems snooty of me to expect others to keep track of such technicalities… But downplaying the relevance of the proposals’ “stages” has two real dangers.
The Threat to the Ecosystem
The first (as described in the doomsaying at the outset of this post) is that, in our rush to build on top of an evolving platform, we fragment our infrastructure over time. I’ll point out the ironic value of the name “Babel”–a project whose widespread adoption has the potential to truly confound the language.
This is not a new problem, though, and it’s something we’re already dealing with today. Consider jQuery, a library deployed to millions of websites. Up until June of 2016, it included an implementation of Promise that was not standards-compliant. We seem to be getting by just fine despite this.
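To illustrate the kind of divergence involved, here is a hedged sketch (assuming a pre-3.0 version of jQuery): a compliant Promise converts a thrown exception into a rejection, while jQuery’s older implementation let it escape.

```js
// A standards-compliant Promise turns an exception thrown in a
// callback into a rejection of the derived promise:
Promise.resolve('ok')
  .then(() => { throw new Error('oops'); })
  .catch((error) => console.log('caught:', error.message)); // caught: oops

// jQuery prior to 3.0 instead allowed the exception to propagate
// synchronously, so the failure handler below would never run:
//
//   $.Deferred().resolve('ok')
//     .then(() => { throw new Error('oops'); }) // throws immediately
//     .fail(() => console.log('never reached'));
```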
When the same class of specification violation occurs at a language level, though, the effect is much more severe. It is much harder to debug (or even identify) issues that arise from the code’s syntax (the correctness of which we generally take for granted) than those that come from libraries we interact with.
(This, by the way, is part of the motivation for the futurehostile option in the JavaScript linter JSHint. The setting disallows the creation of bindings that are globally defined in future versions of the language. In addition to mitigating migration issues, it also encourages projects to explicitly label polyfills. For instance: when a library-provided Promise constructor is imported as BPromise, readers can build a much clearer understanding of the surface area for bugs.)
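A minimal sketch of that setup, assuming a project that lints with JSHint and uses the Bluebird library as its Promise implementation:

```js
// .jshintrc:
//
//   { "futurehostile": true }

// With the option enabled, JSHint warns on `var Promise = ...`,
// because that name is defined by a newer version of the language.
// Binding the library to a distinct name satisfies the linter and
// makes every use of the non-native constructor explicit:
var BPromise = require('bluebird');

BPromise.resolve(42).then(function (value) {
  console.log(value); // 42
});
```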
The Threat to the Platform
TC-39 operates based on the consensus of its members–an interdisciplinary group composed not only of researchers and runtime implementors, but also of practitioners from organizations like the JS Foundation, Tilde, Bocoup, and Shape Security. As a result, consensus is derived not just from some idealized design, but from the realities of the industry. Take, for example, the following dialog on the subject of modules from last month’s proceedings:
Dave Herman: Design constraints: it needs to be possible to import named exports from CJS, and [using the require function to load an ECMAScript module] needs to [return] synchronously.

Jeff Morrison: Are these technical needs or ecosystem needs?
James Snell: These are ecosystem needs. Babel today can do these things. Those users will want to be able to not change their code. If we say that doesn’t work, we’re violating a concern.
This demonstrates how user expectations push the committee to make difficult decisions. The more eagerly we build and deploy systems on proposed extensions, the more difficult it becomes for standards bodies to amend the design. Remember: it’s not “done” until stage 4! In extreme cases, this could lead to final designs that include sub-optimal aspects informed by “web reality.” That’s not a theoretical concern, either. Already, the specification devotes an entire section to the various irregularities that came about in this way.
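One well-known resident of that section is document.all, which the specification requires to masquerade as undefined because legacy browser-sniffing code depended on that behavior. A sketch, meant for a browser console:

```js
// `document.all` is a real, usable object...
document.all.length > 0;   // true on any non-empty page

// ...yet the specification obliges it to impersonate undefined:
typeof document.all;       // "undefined"
document.all == null;      // true
```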
Lingua Franca
These threats are credible only to the extent that we collectively adopt early-stage proposals. If we, as an industry, take a more conservative tack, then we don’t need to worry.
We might respond by refusing to use any syntax that has not been formally ratified by ECMA. In other words, “We won’t use ES20XX features until ES20XX is published.” In this case, all code across all projects would be fully standard-compliant at all times, and we wouldn’t have to worry about fragmentation or curtailing the design process.
Even if I thought anyone would listen to such a recommendation, I wouldn’t endorse it. Implementation feedback is a crucial part of the design process, so we very much need experimentation. Runtimes like V8 and SpiderMonkey should parse and execute experimental syntax (though behind a flag); transpilers like Babel and Traceur should translate experimental syntax; application developers should write code using experimental syntax. It’s our best hope of homing in on more of the kinds of beautiful abstractions that make JavaScript so enjoyable.
If we instead develop an awareness of each proposal’s current “stage” and exercise some sensitivity to that status, then we can participate in the advancement of ECMAScript in a way that is both effective and responsible. This requires some nuance, so we probably can’t define any hard-and-fast rules. I can make some general suggestions, though:
- Stage 2 and below: Reserve for personal experiments–not projects with any dependents. Of course, it’s always safe to experiment on a clearly-labeled “unstable” branch. Just know that larger projects may require more refactoring in the event of change. (Remember that Object.observe advanced to this stage before ultimately being withdrawn.) Share your experiences on the es-discuss mailing list or on the proposal’s issue tracker.
- Stage 3: Implement in non-critical production code. Your experience in a more realistic setting may uncover new wrinkles–share those immediately! Be cautious about using it in larger projects because nothing is set in stone.
- Stage 4: Use as you wish. This proposal is effectively standardized; only formalities remain. Feedback is nice but no longer effective.
There’s definitely some room for “fudging” between these stages; being dogmatic isn’t going to serve anyone. However, this strategy does have one aspect that we should consider non-negotiable: feedback. Developers who experiment with early-stage proposals have a certain responsibility to engage in the process.
So get out there and start experimenting: bind some functions, decorate some methods, and cancel some Promises. Use these early experiments to satisfy your curiosity and provide feedback, but please think twice before building your next product with any features not yet standardized.
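If you want a concrete starting point, here is a hedged sketch of two such experimental constructs. Both require dedicated transpiler plugins, and memoize is a hypothetical decorator function, not a built-in.

```js
// The experimental "::" bind operator calls a free function with
// the left-hand operand as its `this` value:
function shout() {
  return this.toUpperCase() + '!';
}
const yelled = 'hello'::shout(); // "HELLO!"

// The experimental decorator syntax wraps a method at definition
// time (`memoize` is assumed to be defined elsewhere):
class Catalog {
  @memoize
  total() {
    return this.items.reduce((sum, item) => sum + item.price, 0);
  }
}
```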
Comments
I’ve stuck with stage 2, but I try to cherry-pick features that I’m very sure will move forward as they are: async/await, object rest/spread, etc. I’ve been pretty lucky, and I understand the risks, but sometimes an experimental feature is just that useful in terms of a cleaner codebase.
@Mike, it’s an interesting topic you’ve covered here. To be honest, I think that by 2020 most of the code we write now will be destroyed or rewritten, and I think this will be the case because of the speed at which the JS world evolves.
Looking back at the ES6 feature plans and at the final standard, we can say it did not change much… I think that once a new version of ES is defined, the ideas are taken from already-working models in other programming languages (like async/await), where the feature is proven and stable. So I hope that by embracing the new standards as early as possible we do not anchor our codebase to a version, but rather future-proof our code.
> by 2020 most of the code we write now will be destroyed or rewritten
That may possibly be true for you, but it certainly isn’t true for most developers out there.
Most of us are getting paid to write code because we produce something that has lasting value. Lasting value means that it is expected to still be running in five years’ time with as little maintenance as possible. (If it’s still running in ten years, that would be seen as a bonus, although even the most backward-looking manager might agree that’s a bit much. But five years is definitely not asking too much.)
If I think back to the work I did five to ten years ago, I know for a fact that several of those systems are still running with my original code. I also know that at least one of them was originally intended as a proof of concept prototype and has a lot of bad code as a result. Not ideal, but management saw it, liked it, and wanted it immediately. That happens a *lot*, so you should never take it for granted that your prototype code won’t end up as a critical production system for the next decade.
> I hope that by embracing the new standards as early as possible we future-proof our code.
Hmm, that hope would seem to contradict your belief that your code will be thrown away by 2020. Future-proofing doesn’t mean “it still works in six months”; it means “it still works in ten years”.
And future-proofing doesn’t mean using as many new features as possible. It just requires avoiding features that are deprecated or might become deprecated in the future.
I’ll finish by saying that the problems described in this article are not hypothetical; they’re real. My current employer has projects under way using pretty much every tech stack you can think of… except Node.js. The reason we aren’t using Node? Nervousness about the long-term stability of anything we write. We’ve done various prototypes with Node, and each time we’ve picked what we thought were the best choices of libraries from NPM. Most of those libraries are now obsolete. When clients are spending 500k+ on a project, we won’t win much work if we suggest a platform that will need to be rewritten from scratch a year later.
hi Spudley,
Thanks for your feedback; you do have valid points. But maybe I was misunderstood, because I missed a point in my comment, and thank you for highlighting that!
As you mentioned, most NPM packages become obsolete pretty quickly, and that is true. Actually, this tendency not to support libraries over the long term is not specific to Node or JS; I think it’s more a result of the open-source community approach: if a tool is really good and used by many, it will get constant updates/upgrades; otherwise it will soon be deprecated. This was the main idea behind my comment that software written today may be rewritten by 2020: the libraries it depends on are not maintained, and maintaining or fixing bugs in those is much more hassle than replacing them.
I agree with your other point too: the right tool for the right job. If the target is long-term support of a product where robustness and stability matter most, then you are right, maybe Node isn’t the best platform to start with or rely on. There are companies with businesses worth more than 500k built on Node, but your case may be different, and this technology may not fit your needs; there is no problem with that…
Tried to explain the stages in a tweet: https://twitter.com/brunole…
Where do you think something like JSX fits into this?
I am definitely the “get off my lawn” developer when it comes to JavaScript and how it’s fragmented. I prefer to stick with non-ES6 syntax and features because I can write code for the browser without transpilers. While I say this, I remember a time when jQuery and I would make synchronous (blocking) Ajax requests because I was fuzzy on how to structure asynchronous code. You can’t make blocking Ajax requests anymore, for good reason, which means the code I wrote back then won’t work as expected anymore. All of this may change if WebAssembly ever comes to pass. If it does, will any flavor of JavaScript still be as hot then as it is now?