431 points by onlyspaceghost 18 hours ago | 52 comments
auxiliarymoose 16 hours ago
I really think writing dependency-free JavaScript is the way to go nowadays. The standard library in JS/CSS is great. So are static analysis (TypeScript can check JSDoc), imports (ES modules), UI (web components), etc.

People keep telling me the approach I am taking won't scale or will be hard to maintain, yet my experience has been that things stay simple and easy to change in a way I haven't experienced in dependency-heavy projects.

Joeri 12 hours ago
I’ve been exploring this for years, even made a tutorial website about building sites and apps without dependencies (plainvanillaweb.com). What I’ve learned is that many of the things the frameworks, libraries and build tools do can be replaced by browser built-ins and vanilla patterns, but also that making things that way is at present an obscure domain of knowledge.

I think this is because the whole web dev knowledge ecosystem of youtubers and tutorial platforms is oriented around big frameworks and big tooling. People think it is much harder than it actually is to build without frameworks or build tools, or that the resulting web app will perform much worse than it actually will. A typical React codebase ported to a fully vanilla codebase ends up just as modular and around 1.5x the number of lines of code, and is tiny in total footprint due to the lack of dependencies, so it typically performs well.

To be clear though: I’m not arguing the dependencies are bad or don’t have any benefits at all or that vanilla coding is a superior way. Coding this way takes longer and the resulting codebase has more lines of code, and web components are “uglier” than framework components. What I’m saying is that most web developers are trapped in a mindset that these dependencies must be used when in reality they are optional and not always the best choice.

auxiliarymoose 11 hours ago
Thanks for creating and sharing that resource! I'm reading through it now, and it looks fantastic. I'll share it the next time someone asks where to get started with web dev.

Come to think of it, I should write up the techniques I use, too...e.g. I have simple wrappers around querySelector() and createElement() with a bit of TypeScript gymnastics in a JSDoc annotation to add intellisense + type checking for custom elements.
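For illustration, a minimal sketch of that kind of wrapper (the names and exact signatures here are my own guess at the pattern, not the specific helpers described above):

```javascript
// query.js, hypothetical helper names: a sketch of the pattern, not the
// exact wrappers described in the comment above.

/**
 * querySelector() wrapper that throws instead of returning null, with a
 * JSDoc generic so TypeScript can narrow the element type at the call site.
 * @template {Element} T
 * @param {string} selector
 * @param {{ querySelector(sel: string): T | null }} [root]
 * @returns {T}
 */
function $(selector, root = globalThis.document) {
  const el = root.querySelector(selector);
  if (el === null) throw new Error(`No element matches "${selector}"`);
  return el;
}

/**
 * createElement() wrapper: the tag-name generic gives intellisense for the
 * created element's properties, e.g. el("a", { href: "/" }) types as an anchor.
 * @template {keyof HTMLElementTagNameMap} K
 * @param {K} tag
 * @param {Partial<HTMLElementTagNameMap[K]>} [props]
 * @returns {HTMLElementTagNameMap[K]}
 */
function el(tag, props = {}) {
  return Object.assign(document.createElement(tag), props);
}
```

Running `tsc --checkJs --noEmit` (or just opening the file in an editor with TypeScript support) then type-checks the call sites with no build step.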

Would you be open to a pull request with a page on static analysis/type checking for vanilla JS? (intro to JSDoc, useful patterns for custom elements, etc.) If not, that's totally OK, but I figure it could be interesting to readers of the site.

And agreed on vanilla/dependency-free not being a silver bullet. There aren't really one-size-fits-all solutions in software, but I've found a vanilla approach (and then adding dependencies only if/when necessary) tends to help the software evolve in a natural way and stay simple where possible.

zahlman 3 hours ago
> To be clear though: I’m not arguing the dependencies are bad or don’t have any benefits at all or that vanilla coding is a superior way. Coding this way takes longer and the resulting codebase has more lines of code, and web components are “uglier” than framework components.

If you do it long enough, presumably you start to develop your own miniature "framework" (most likely really just some libraries, i.e. not inverting control and handing it over to your previous work). After all, it's still programming; JS isn't exceptional even if it has lots of quirks.

Anyway, love the website concept, just a quick petition: would it be possible to apply some progressive enhancement/graceful degradation love to <x-code-viewer> such that there's at least the basic preformatted text of the code displayed without JS?

JodieBenitez 4 hours ago
Such a great website! Congrats.
j45 9 hours ago
Nice resource!

Depending on the use case, minimizing dependencies can also decrease attack vectors on the page/app.

Maxion 14 hours ago
Did this for a project in 2022. Haven't had any drama related to CVEs, haven't had any issues related to migrating from some version of something to another.

The client has not had to pay a cent for any sort of migration work.

jsmith99 9 hours ago
Is the lack of CVE because the implementations you wrote are better written and safer than those in the standard libraries or because no one has checked?
teaearlgraycold 1 hour ago
Well there's probably far less attack surface.
foldr 9 hours ago
Presumably the latter. However, mindlessly bumping package versions to fix bullshit security vulnerabilities is now industry standard practice. Once your client/company reaches a certain size, you will pretty much have to do it to satisfy the demands of some sort of security/compliance jarl.
consp 6 hours ago
And yet npm install [package with 1000 recursive dependencies] is not considered a supply chain risk at all to those security/compliance jarls.

Let alone having to check all licenses...

auxiliarymoose 11 hours ago
There are certainly security benefits to keeping things in-house. Less exposure to supply-chain attacks (e.g. shai-hulud malware) and widespread security bugs (e.g. react server components server-side RCE). Plus it's much easier to do a complete audit and threat model of the application when you built and understand everything soup-to-nuts.

Of course, it also means you have to be cautious about problems that dependencies promise to solve (e.g. XSS), but at the same time, bringing in a bunch of third-party code isn't a substitute for fully understanding your own system.
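As a concrete example of the XSS point, this is the sort of small utility you end up owning yourself once no framework is auto-escaping for you (a hypothetical sketch, not code from any project mentioned above):

```javascript
// Hypothetical escapeHtml helper. Where possible, prefer building the DOM
// with textContent / createElement instead of assembling HTML strings at all;
// this is for the cases where you really do interpolate into markup.
function escapeHtml(value) {
  return String(value)
    .replaceAll("&", "&amp;") // must come first, or it double-escapes the rest
    .replaceAll("<", "&lt;")
    .replaceAll(">", "&gt;")
    .replaceAll('"', "&quot;")
    .replaceAll("'", "&#39;");
}
```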

zelphirkalt 7 hours ago
Very laudable, though this is probably also part of the issue: If the client doesn't need any migration work, the dev doesn't get more money, which in turn might be phrased: "It is difficult to get a man to understand something, when his salary depends upon his not understanding it!" -- by someone other than me.

I have worked at an employer where one could have done the frontend easily in a traditional server-side templating language, since most of the pages were static information anyway and very little was interactive. But instead of having 1 person do that, producing an easily accessible and standards-conforming frontend, they decided to go with Next.js and required 3 people full-time to maintain it, including all the usual churn and burn of updating dependencies and changing the "router" and stuff. Porting a menu from one instance of the frontend to another frontend took 3 weeks. Fixing a menu display bug after I reported it took 2 or 3 months.

j45 9 hours ago
It's nice to sidestep the relative brittleness of web implementations simply because of versions.
bell-cot 11 hours ago
> The client has not had to pay a cent for ...

From human society's PoV, you sound like a 10X engineer and wonderful person.

But from the C-suite's PoV ...yeah. You might want to keep quiet about this.

robocat 9 hours ago
Rendering components is the easy part. Another goal of frameworks is to provide the model (reactive updates): https://mjswensen.com/blog/the-single-most-important-factor-...

What do you use for model updates?

apsurd 2 hours ago
web components are reactive. They have a render pattern similar to React's render function. Granted, web-components are much more wonky than React, but the functionality is there.

The best practice seems to be to use the component's attributes directly: the component subscribes to its attribute-change lifecycle and re-renders on updates.
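A minimal sketch of that attribute-driven pattern (the element and attribute names are made up; the base-class shim exists only so the snippet also runs outside a browser, where you would extend HTMLElement directly):

```javascript
// Shim so the example runs outside a browser. Note: real browsers only call
// attributeChangedCallback for attributes listed in observedAttributes.
const Base = globalThis.HTMLElement ?? class {
  #attrs = new Map();
  getAttribute(name) { return this.#attrs.get(name) ?? null; }
  setAttribute(name, value) {
    this.#attrs.set(name, String(value));
    this.attributeChangedCallback?.(name, null, String(value));
  }
};

class ClickCounter extends Base {
  // The browser watches these attributes and reports changes to the element.
  static observedAttributes = ["count"];
  connectedCallback() { this.render(); }        // element added to the DOM
  attributeChangedCallback() { this.render(); } // observed attribute changed
  render() {
    this.textContent = `Count: ${this.getAttribute("count") ?? 0}`;
  }
}

// In a browser: customElements.define("click-counter", ClickCounter);
if (globalThis.customElements) customElements.define("click-counter", ClickCounter);
```

With this shape, `element.setAttribute("count", "5")` is the state update, and the render is driven entirely by the standard lifecycle callbacks.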

j45 9 hours ago
Looking into the history of reactive updates, we find that simple JavaScript commands helped kickstart most of it.

https://en.wikipedia.org/wiki/Ajax_(programming)

The idea of reactivity started in the 1990s, in production systems.

When Gmail was released, this technology is what made a website behave like a desktop app (plus the huge amount of storage).

If we were to look into today's equivalent of doing this, it might be surprising what exists in the standard libraries.

Bockit 10 hours ago
I've been doing JS for nearly a couple of decades now (both front and back) and I landed on the same approach a few years ago. Pick your absolutely minimal set of dependencies, and then just make what you need for everything else. Maybe counter-intuitive to some, but I feel like I'm more comfortable maintaining a larger codebase with fewer people.

What's more, given the tools we have today, it fits really well with agentic engineering. It's even easier to create and understand a homegrown version of a dependency you may have used before.

CoderLuii 15 hours ago
been doing something similar. the projects ive been building recently use as few dependencies as possible and honestly the maintenance burden dropped significantly. when something breaks you actually know where to look instead of digging through 15 layers of node_modules. people said the same thing to me about it not scaling but the opposite turned out to be true.
auxiliarymoose 12 hours ago
yeah, plus stack traces, debuggers, and profiling tools are easier to use when all of the non-essential complexity is stripped out. which in turn means it's possible to work productively on software that solves more complex problems.

that's in contrast with the sort of stuff that invariably shows up when something falls over somewhere in a dependency:

    cannot access property "apply" of null
    at forEach()
    at setTimeout()
    at digest()
    at callback()
    at then()
    ...
it's not fun to step through or profile that sort of code either...
arcadianalpaca 5 hours ago
Yeah, I've been doing this more and more. The friction of keeping dependencies updated has gotten worse than the friction of just maintaining the (oftentimes) 20 or so lines you need to write yourself. Not to mention the pain you'll eventually find yourself in when a bug isn't patched fast enough, or when you need to make a small change, or when a transitive dependency you've never heard of gets compromised...
assimpleaspossi 9 hours ago
CSS has a standard library? I stopped doing web dev just three years ago and am not aware of such a thing. Do you mean the CSS standard?
auxiliarymoose 1 hour ago
Sure, it's not officially called the "standard library"; more precisely it would be "the parts of the ECMAScript and CSS standards implemented by all popular evergreen browsers," but "standard library" expresses this in the way people usually talk about programming languages.
theandrewbailey 9 hours ago
It wouldn't surprise me if CSS has a standard library. It is Turing complete, after all.
obsidianbases1 6 hours ago
This is the way. Even more so now that LLMs can reliably write simple utilities: the kind of thing a dependency would previously drag in alongside hundreds of other utilities (that go unused), all while depending on another dozen dependencies.
leptons 14 hours ago
If I need a library for nodejs, the first thing I do is search for the dependency-free option. If I can make that work, great.
anematode 14 hours ago
This is absolutely the way to go
k__ 10 hours ago
Doesn't this go against the credo of not building your own crypto?
auxiliarymoose 10 hours ago
No, it means using the crypto module in the standard library instead of importing some third party dependency.
embedding-shape 10 hours ago
Depends on what cryptography you're talking about, the Web Crypto API exists for quite some time, so I'd say that fits in (usually) with "The standard library in JS/CSS is great".
kigiri 30 minutes ago
I work on a ~9-year-old Node.js codebase and have none of those issues; we have 8 dependencies, and that's the fully resolved tree.

One to generate zip files, one for markdown parsing, one for connecting to Postgres, etc... most of them have no sub-dependencies.

We always reach first for what the Node.js standard library has, and glue together small, specific pieces of code ourselves when needed.

The app is very stable and we have very few of the frustrations I used to have before. Note that we used to have way more dependencies, but removed them bit by bit.

Now, I would whitelist anything from the Deno std lib; they did a great job with that. Even if you don't use Deno, with whatever your runtime provides plus Deno std you never need more than a few packages to build anything.

JS is doing pretty well if you are mindful about it.

stevoski 12 hours ago
Well-written article, manages not to sound rant-y while describing the problem well.

I feel like part of the blame for the situation is that JavaScript has always lacked a standard library which contains the "atomic architecture" style packages. (A standard library wouldn't solve everything, of course.)

josephg 9 hours ago
What functionality is still missing from the JS standard library? The JS standard library seems massive these days.

Edit: Removed a reference to node and bun.

Frotag 1 hour ago
Some utility stuff I copy paste between projects:

  - range, clamp, inIvl, enumerate, topK
  - groupBy (array to record), numeric / lexical array sorts
  - seeded rng
  - throttling
  - attachDragListener (like d3's mousedown -> mousemove -> mouseup)
  - Maps / Sets that accept non-primitive keys (ie custom hash)
So basically functions that every *dash variant includes.
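A few of these are small enough to sketch inline (plain-JS sketches of the idea, not the exact versions from the list above):

```javascript
// range(3) -> [0, 1, 2]
const range = (n) => Array.from({ length: n }, (_, i) => i);

// clamp a value into [lo, hi]
const clamp = (x, lo, hi) => Math.min(Math.max(x, lo), hi);

// Group items into a plain record keyed by keyFn(item).
function groupBy(items, keyFn) {
  const out = {};
  for (const item of items) (out[keyFn(item)] ??= []).push(item);
  return out;
}
```

Worth noting that `Object.groupBy` / `Map.groupBy` landed natively in ES2024, which shrinks this copy-paste list a little further.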
Sharlin 9 hours ago
It's not standard unless it's in the actual standard.
simonw 9 hours ago
Those aren't a standard library for the language itself - they're not showing up in browsers, for example.
josephg 8 hours ago
Thanks but that doesn't answer my question. Forget node and bun then. What is missing from the standard library?
realityking 2 hours ago
There’s been a lot of progress (Temporal, URL, TextDecoder, Base64 encoding, etc.) but there are still gaps.

Math.clamp is a big one (it’s a TC39 proposal). I’d also love to have the stats functions that Python has (geometric mean, median, etc.).

On the more ambitious end: CSV reading/writing and IPv4/IPv6 manipulation.
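For reference, the stats helpers mentioned are only a few lines in userland while we wait for anything standard (a sketch, with no input validation):

```javascript
// Median of a numeric array (does not mutate the input).
function median(xs) {
  const s = [...xs].sort((a, b) => a - b);
  const mid = s.length >> 1;
  return s.length % 2 ? s[mid] : (s[mid - 1] + s[mid]) / 2;
}

// Geometric mean via logs, to avoid overflow on large products.
const geometricMean = (xs) =>
  Math.exp(xs.reduce((acc, x) => acc + Math.log(x), 0) / xs.length);
```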

simonw 8 hours ago
What standard library? Do you mean the built-in Array.x etc methods you get in the core language spec?
skydhash 8 hours ago
In the browser, Javascript’s role is to add interactivity to the web page, and the API has a good surface area (even if not really pretty). People talk about the lack of standard library, but they can never say what’s missing.

https://developer.mozilla.org/en-US/docs/Web/API

The above seems fairly expansive, even if we remove all the experimental ones in the list.

tshaddox 7 hours ago
I suppose we could quibble about what exactly “standard library” means, but I’m presuming we’re talking about the web (rather than, say, Node or Bun). And to me it’s fair to use it to refer to all web APIs that are widely available. Things like crypto, ArrayBuffer, TextEncoder, File and the File System Access API, Intl, the Streams API, Window.performance, etc.
throw-the-towel 6 hours ago
String.splitRight, for one. (As an example: "www.a.north.website.com".splitRight(".", 3) == ["www.a.north", "website", "com"].)
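A userland sketch matching that example (`splitRight` is the commenter's hypothetical name; here `n` is read as the maximum number of pieces, splitting from the right):

```javascript
// Split from the right into at most n pieces.
function splitRight(str, sep, n) {
  const parts = str.split(sep);
  if (parts.length <= n) return parts;
  const head = parts.slice(0, parts.length - n + 1).join(sep);
  return [head, ...parts.slice(parts.length - n + 1)];
}
```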
bakkoting 6 hours ago
Python and Rust have such a thing, but not e.g. Java, Go, C#. And I can't find any libraries on npm which do this. That seems like a very niche need, not actually the sort of thing whose absence causes people to have lots of npm dependencies.
fireflash38 5 hours ago
When would you want that when it wouldn't be covered by more domain-specific use cases?
eudamoniac 4 hours ago
Most things in Remeda, ramda, rxjs, the methods in the Ruby stdlib, etc. would all be great to have. I use at least Remeda in every project when I can.
jefftk 9 hours ago
We're talking about JS in browsers: many fewer options there, plus needing to support old devices.
couscouspie 11 hours ago
I like rants though. They help me understand, not only how people feel about stuff, but also why.
andai 16 hours ago
Great article, but I think these are all marginal.

The main cause of bloat is not polyfills or atomic packages. The cause of bloat is bloat!

I love this quote by Antoine de Saint-Exupéry (author of the Little Prince):

"Perfection is achieved, not when there is nothing left to add, but nothing to take away."

Most software is not written like that. It's not asking "how can we make this more elegant?" It's asking "what's the easiest way to add more stuff?"

The answer is `npm i more-stuff`.

cwnyth 15 hours ago
Cf. Vonnegut's rule #4 of good writing:

> Every sentence must do one of two things—reveal character or advance the action.

Or Quintilian's praise of Demosthenes and Cicero: "To Demosthenes nothing can be added, but from Cicero nothing can be taken away."

Defenestresque 8 hours ago
For the curious, the rest of the rules:

Use the time of a total stranger in such a way that he or she will not feel the time was wasted.

Give the reader at least one character he or she can root for.

Every character should want something, even if it is only a glass of water.

Every sentence must do one of two things—reveal character or advance the action.

Start as close to the end as possible.

Be a sadist. No matter how sweet and innocent your leading characters, make awful things happen to them—in order that the reader may see what they are made of.

Write to please just one person. If you open a window and make love to the world, so to speak, your story will get pneumonia.

Give your readers as much information as possible as soon as possible. To heck with suspense. Readers should have such complete understanding of what is going on, where and why, that they could finish the story themselves, should cockroaches eat the last few pages.

The greatest American short story writer of my generation was Flannery O'Connor (1925-1964). She broke practically every one of my rules but the first. Great writers tend to do that.

assimpleaspossi 6 hours ago
>>Write to please just one person. If you open a window and make love to the world, so to speak, your story will get pneumonia.

This, too, is the problem with movies and TV shows today. They worry so much about offending anyone they lose the interest of everyone. When was the last time you laughed hard and out loud?

thunky 7 hours ago
> Give the reader at least one character he or she can root for.

I've been noticing for a while now this is missing in most modern tv shows. It makes the show feel pointless.

cobbzilla 15 hours ago
Is there no room for describing the setting? Must every utterance that sets the atmosphere also advance the plot or reveal character? Is there no room for mood?
nkrisc 7 hours ago
What is the purpose of the setting if not to reveal character or advance the plot?

I don’t need to know the color of the walls if it does neither.

root_axis 6 hours ago
Framed that way you could characterize anything as ultimately serving the characters or plot.
nkrisc 5 hours ago
Not really. There are infinite insignificant details that could be included that should not be included because they do neither in any meaningful way.
alt187 9 hours ago
describing the setting should (ideally) be done through a character's interaction with the setting.

if you're developing some sort of dystopia where everyone is heavily medicated, better to show a character casually take the medication rather than describe it.

of course, that's not a rule set in stone. you can do whatever the fuck you want.

hombre_fatal 15 hours ago
> Is there no room for describing the setting? Is there no room for mood?

You mean the character of a place?

cobbzilla 13 hours ago
sure, setting and character are the same thing
bryanrasmussen 12 hours ago
the implication is that if mood is the character of the place then those sentences that set mood are advancing character.
josephg 9 hours ago
Some authors rarely describe a place objectively. We see a space through the eyes of the characters - and in doing so, we learn about our characters as we learn about the space they inhabit.
bryanrasmussen 6 hours ago
sure, if a character is in some narrative role; however I would argue that no author ever describes a place objectively, especially not a completely fictional place. The question really is if the unobjective description serves a coherent narrative purpose.
IsTom 11 hours ago
He's very efficient with prose and I find it a joy to read (well, given what he's writing about it's not always joy, but still). I'm not sure he's following that rule 100% of the time, but it's close. Depending on the setting, you can often describe it through characters' actions or how it shapes them.
righthand 9 hours ago
The “mood” should reflect the character not the author’s desire to detail out the room.
brigandish 11 hours ago
Setting would provide the context for action or characterisation to occur in a meaningful way, or provoke it, so it is a necessary part of both (if done for either of those purposes). Given that, the charitable interpretation would be to only provide enough description of the setting for that.
sheept 13 hours ago
All software has bloat, but npm packages and web apps are notorious for it. Do you think it could be inherent to the language?

JavaScript seems to be unique in that you want your code to work in browsers of the past and future—so a lot of bloat could come from compatibility, as mentioned in the article—and it's a language for UIs, so a lot of bloat in apps and frameworks could come from support for accessibility, internationalization, mobile, etc.

guax 12 hours ago
The problem JS development is facing is one most languages might go through: the "magic" that solves all problems, frameworks and solutions that fix small issues at a great cost.

Lots of developers don't even say they are JS devs but React devs or something. This is normal given that the bandwidth and power of targets are so large nowadays. Software is like a gas, it will fill all the space you can give it since there is no reason to optimize anything if it runs ok.

I've spent countless hours optimising JavaScript and CSS to work across devices that were slow and outdated but still relevant (IE7, 8 and 9 were rough years). Cleverness breeds in restrictive environments where you want to get the most out of them. Modern computers are so capable that it's hard to hit the walls when doing normal work.

socalgal2 12 hours ago
Every cargo install (Rust) I've done downloads 300 to 700 packages

Every C++ app I install in linux requires 250 packages

Every python app I install and then pip install requirements uses 150 packages.

stingraycharles 10 hours ago
This is not true at all.
Tade0 10 hours ago
A while ago I started a game project in Rust using one of the popular engines.

10GB of build artifacts for the debug target.

embedding-shape 10 hours ago
You should give it a try to compile other game engines, and compare them, Unreal Engine is a fun one with the source available, take a look how big their artifacts are :)

With that said, there are plenty of small game engines out there, but couple Rust's somewhat slow compile times with the ecosystem's preference for "many crates" over "one big crate", and yeah, even medium-scale game engines like Bevy take a bunch of time and space to compile. But it is a whole game engine after all, maybe not representative of general development in the community.

andai 5 hours ago
In Rust land, I enjoyed Macroquad, for simple 2D stuff. It's very much in the vein of XNA/MonoGame.
IshKebab 10 hours ago
I wouldn't say every Rust app does, but I do think it has become more normal for Rust apps to have 200-600 dependencies. However when I look at the list, they usually all make sense, unlike with NPM. There are rarely any one-line crates. Actually I haven't seen any yet (except joke ones of course).

There's no way the average C++ app uses 250 packages though. It's usually more like 5. C++ packaging is a huge pain so people tend to use them only when absolutely necessary, and you get huge libraries like Boost primarily because of the packaging difficulty.

I would say Python varies but 150 sounds high. Something more like 50-100 is typical in my experience.

PaulDavisThe1st 3 hours ago
Ardour (an open-source x-platform digital audio workstation, written in C++) has on the order of 80 dependencies.
andai 5 hours ago
I think it has to do with temperament and incentives.

For example we often see posts on HN about, "see, it's possible to write very fast software in language foo!" And most of the time yes, especially on modern hardware, most languages do allow you to write surprisingly fast software!

It's just that the people who actually want their software to run fast -- and who want it enough to prioritize it against other, competing values -- those people will generally reach for other languages.

With JavaScript, the primary "value" is convenience. The web as a platform is chosen because it is convenient, both for the developer and the user. So it stands to reason that the developer will also make other choices in the name of convenience.

Of course, there are weirdos like me who take pride in shipping JS games that are eight kilobytes :) But there are not very many people like that.

lukan 12 hours ago
"Do you think it could be inherent to the language?"

Not to the language but its users. Not to bash them, but most of them did not study IT at a university, did not learn about the KISS principle, etc.

They just followed some tutorials to hack together stuff, now automated via LLMs.

So in a way the cause is the language, as it is so easy to use. And the ecosystem grew organically from users like this - and yes, the ecosystem is full of bloat.

(I think Claude nowadays is a bit smarter, but when building standalone HTML files without agents, I remember having to always tell ChatGPT to explicitly NOT pull in yet another library, but to use plain vanilla JS for a standard task, which usually works better and cleaner with the same number of lines, or maybe 2 or 3 more, in most cases. The default was to use libraries for every new functionality.)

bigstrat2003 12 hours ago
> All software has bloat, but npm packages and web apps are notorious for it. Do you think it could be inherent to the language?

It sure seems like it is because JS devs, by and large, suck at programming. C has a pretty sparse standard library, but you don't see C programmers creating shared libraries to determine if a number is odd, or to add whitespace to a string.

qayxc 11 hours ago
> you don't see C programmers creating shared libraries to determine if a number is odd, or to add whitespace to a string.

Believe me, if C had a way to seamlessly share libraries across architectures, OSes, and compiler versions, something similar would have happened.

Instead you get a situation where every reasonably big modern C project starts by implementing their own version of string libraries, dynamic arrays, maps (aka dictionaries), etc. Not much different really.

zdc1 17 hours ago
A lot of this basically reads to me like hidden tech debt: people aren't updating their compilation targets to ESx, people aren't updating their packages, package authors aren't updating their implementations, etc.

Ancient browser support is a thing, but ES5 has been supported everywhere for like 13 years now (as per https://caniuse.com/es5).

anematode 17 hours ago
The desire to keep things compatible with even ES6, let alone ES5 and before, is utterly bizarre to me. Then you see folks who unironically want to maintain compatibility with node 0.4, in 2025, and realize it could be way worse....

Ironically, what often happens is that developers configure Babel to transpile their code to some ancient version, the output is bloated (and slower to execute, since passes like regenerator have a lot of overhead), and then the website doesn't even work on the putatively supported ancient browsers because of the use of recent CSS properties or JS features that can't be polyfilled.

I've even had a case at work where a polyfill caused the program to break. iirc it was a shitty polyfill of the exponentiation operator ** that didn't handle BigInt inputs.
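The failure mode is easy to reproduce: a transpiler targeting ES5 replaces `**` with `Math.pow`, and `Math.pow` coerces its arguments to Number, which BigInt refuses (a minimal illustration of the class of bug, not the actual polyfill from that codebase):

```javascript
// What a transpiler typically emits for `a ** b` when targeting ES5:
const transpiledPow = (a, b) => Math.pow(a, b);

console.log(transpiledPow(2, 10)); // 1024, fine for Numbers
console.log(2n ** 10n);            // 1024n, the native operator handles BigInt

try {
  transpiledPow(2n, 10n); // Math.pow coerces its arguments to Number...
} catch (e) {
  console.log(e instanceof TypeError); // true: BigInt refuses implicit coercion
}
```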

Slothrop99 12 hours ago
Maybe I didn't look hard enough, but there's no obvious switch to "just turn off all the legacy stuff, thnx".

Also, there has been a huge amount of churn on the tooling side, and if you have a legacy app, you probably don't wanna touch whatever build program was cool that year. I've got a React app which is almost 10 years old; there has to be tons of stuff that's even older.

vkou 12 hours ago
> Maybe I didn't look hard enough, but there's no obvious switch to "just turn off all the legacy stuff, thnx".

There is. Break compatibility for it, and whatever poor bastard that is still maintaining software that is targeting a PalmPilot is free to either pin to an older version of your library, or fork it. Yes, that's a lot of pain for him, but it makes life a little easier for everyone else.

josephg 9 hours ago
This is my philosophy too. If the nodejs project doesn't support node 18, why on earth should I?

Here's the schedule, if anyone hasn't seen it. Node 18 is EOL. Node 20 goes EOL in a bit over a month.

https://nodejs.org/en/about/previous-releases

Pxtl 15 hours ago
It's just an excuse to not change things.
fragmede 16 hours ago
Just how old an Android device in the developing world do you not want to support? Life's great at the forefront of technology, but there's a balancing act to be able to support older technology vs the bleeding edge.
anematode 16 hours ago
I like the sentiment, but building a website that can actually function in that setting isn't a matter of mere polyfills. You need to cut out the insane bloat like React, Lottie, etc., and just write a simple website, at which point you don't really need polyfills anyway.

In other words, if you're pulling in e.g. regenerator-runtime, you're already cutting out a substantial part of the users you're describing.

Dylan16807 16 hours ago
A quick search tells me that firefox 143 from 6 months ago supported android 5 (Lollipop).

So that's my cutoff.

dfabulich 16 hours ago
Android phones update to the latest version of Chrome for 7 years. As long as you're using browser features that are Baseline: Widely Available, you'll be using features that were working on the latest browsers in 2023; those features will work on Android 7.0 Nougat phones, released in 2016.

Android Studio has a nifty little tool that tells you what percentage of users are on what versions of Android. 99.2% of users are on Android 7 or later. I predict that next year, a similar percentage of users will be on Android 8 or later.

kennywinker 15 hours ago
3.9 billion Android users means that 0.8% is 31 million people, and for a very small number of developers most of their users will be from that slice. For most developers... yeah, go ahead and assume your audience is running a reasonably up-to-date OS.
oflebbe 12 hours ago
Websites built with tons of polyfills are unlikely to be run on these devices anyway: they will run out of RAM first, or only load after some minutes because of CPU limitations, on top of not loading at all because their X.509 certs are outdated and the bandwidth these devices support isn't suited to multi-MB pages.
hsbauauvhabzb 16 hours ago
I’ve been very lost trying to understand the ecosystem between es versions , typescript and everything else. It ends up being a weird battle between seemingly unrelated things like require() vs import vs async when all I want to do is compile. All while I’m utterly confused by all the build tools, npm vs whatever other ones are out there, vite vs whatever other ones are out there, ‘oh babel? I’ve heard the name but no idea what it does’ ends up being my position on like 10 build packages.

This isn’t the desire of people to build legacy support, it’s a broken, confusing and haphazard build system built on the corpses of other broken, confusing and haphazard build systems.

anematode 14 hours ago
Honestly, Vite is all you need. :) It's super flexible compared to the status quo of require vs. import etc. For example, I recently wanted to ship a WASM binary along with the JS rather than making it a separate download (to avoid having to deal with the failure case of the JS code loading and the WASM not fetching). All I had to do was import `a.wasm?url` and it did the base64 embedding and loading automatically.
Maxion 13 hours ago
This sentiment is all well and good, but when you end up in a new-to-you JS codebase with a list of deps longer than a Costco receipt, using some ancient Webpack with its config split into five or so files, then no-one is letting you upgrade to Vite unless the site is completely down.
tankenmate 8 hours ago
It's almost like Churchill's quip "He has all the virtues I dislike and none of the vices I admire". In other words, in some ways the JS ecosystem rushes to all the tech-debt-inducing "shiny shiny" and avoids all the tech-debt-reducing "hard work of refactoring and wisdom". It's almost like large chunks of the JS ecosystem thrive on "the dopamine hit". Santayana's wisdom whispers behind every import.
anematode 13 hours ago
Sad but true...
CoderLuii 15 hours ago
this is exactly where i landed too. i build docker images that bundle node tooling and every time i think i understand the build system something changes. require vs import, cjs vs esm, babel vs swc vs esbuild, then half your dependencies use one format and half use the other. the worst part is when you containerize it because now you need it all to work in a clean linux environment with no cached state and suddenly half the assumptions break.
conartist6 15 hours ago
Yes, yes to all of that, but there is still hope.
hsbauauvhabzb 15 hours ago
This fancy new build tool with emojis will fix it!
kennywinker 15 hours ago
This fancy new vibe coded build tool with emojis
hsbauauvhabzb 11 hours ago
Built in rust
conartist6 7 hours ago
Mmmmhm. But not all the people building devtools got distracted by a pretty butterfly.
tgv 11 hours ago
> Ancient browser support is a thing

And weird browser support.

People use the oddest devices to do "on demand" jobs (receiving a tiny amount of money for a small amount of work). Although there aren't that many, I've seen user agents from game consoles, TVs, old Androids, iPod touch, and from Facebook and other "browser makers", with names such as Agency, Herring, Unique, ABB, HIbrowser, Vinebre, Config, etc. Some of the latter look to be Chrome or Safari skins, but there's no way to tell; I don't know what they are. And I must assume that quite a few devices cannot be upgraded. So I support old and weird browsers. The code contains one externally written module (stored in the repository), so it's only a matter of the correct transpiler settings.

hrmtst93837 13 hours ago
The root issue is that the web rewards shipping now and fixing later, so old deps and conservative targets linger until stuff breaks.
ivanjermakov 5 hours ago
Not just web, gamedev is suffering this too, since ~2020.
Tade0 10 hours ago
Sometimes it's a result of unforeseen consequences of design decisions.

All pre-signal Angular code must be compiled down to JS which replaces native async with Promise.

Why is that so? For a long time Angular's change detection worked by overriding native functions like setTimeout, addEventListener etc. to track these calls and react accordingly. `async` is a keyword, so it's not possible to override it like that.

Signals don't require such trickery and also allow you to significantly decrease the surface area of change detection, but to take advantage of all of that, one has to essentially rewrite the entire application.
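The un-patchable nature of `async` described above can be demonstrated directly in Node. This is a minimal sketch of the underlying ECMAScript behavior (not Angular-specific code): a native async function returns an instance of the engine's intrinsic Promise, regardless of what `globalThis.Promise` currently points at.

```javascript
// Why zone.js-style global patching can't see native async functions:
// the promise they return comes from the intrinsic %Promise%, not from
// whatever the global `Promise` binding has been replaced with.
const NativePromise = Promise;
globalThis.Promise = class FakePromise {};    // framework-style global patch
const p = (async () => 42)();
console.log(p instanceof NativePromise);      // true - the patch is ignored
console.log(p instanceof globalThis.Promise); // false
globalThis.Promise = NativePromise;           // restore the real one
```

This is exactly why pre-signal Angular code compiles `async`/`await` down to explicit Promise chains: those *do* go through the (patchable) global binding.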

userbinator 17 hours ago
The newer version is often even more bloated. This whole article just reinforces my opinion of "WTF is wrong with JS developers" in general: a lot of mostly mindless trendchasing and reinventing wheels by making them square. Meanwhile, I look back at what was possible 2 decades ago with very little JS and see just how far things have degraded.
jazzypants 7 hours ago
Literally nothing has degraded. What in the world are you talking about? All of this stuff is optional.
michaelchisari 16 hours ago
A standard library can help, but js culture is not built in a way that lends to it the way a language like Go is.

It would take a well-respected org pushing a standard library that has clear benefits over "package shopping."

halapro 14 hours ago
> WTF is wrong with JS developers

Don't confuse "one idiot who wants to support Node 0.4 in 2026" with "JS developers". Everybody hates this guy and he puts his hands into the most popular packages, introducing his junk dependencies everywhere.

saghm 13 hours ago
If everyone hates him and thinks his dependencies are junk, why would anyone let him introduce them to popular packages? Clearly there are at least some people who are indifferent enough if the dependencies are getting added elsewhere
albedoa 1 hour ago
The guy you are responding to is longing for what was possible two decades ago. He is that one idiot. He even replied to your comment with confirmation!
Maxion 13 hours ago
The other problem is that this is a bit of a circular path, with deps being so crap and numerous, upgrading existing old projects become a pain. There are A LOT of old projects out there that haven't been updated simply because the burden to do so is so high.
userbinator 13 hours ago
Then I wish there were more of these "idiots who want to support Node 0.4 in 2026". Maybe they're the ones with the common sense to value stability and backwards compatibility over constantly trendchasing the new and shiny and wanting to break what was previously working in the misguided name of "progress".
josephg 9 hours ago
NodeJS has a clear support schedule for releases. Once a version of nodejs is EOL, the node team stops backporting security fixes. And you should really stop using it. Here's the calendar:

https://nodejs.org/en/about/previous-releases

Here's a list of known security vulnerabilities affecting old versions of nodejs:

https://nodejs.org/en/about/eol

In my opinion, npm packages should only support maintained versions of nodejs. If you want to run an ancient, unsupported version of nodejs with security vulnerabilities, you're on your own.
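One way a package can state this policy is the `engines` field in `package.json` (the version range here is illustrative; pick whatever matches Node's current maintained releases):

```json
{
  "engines": {
    "node": ">=20"
  }
}
```

Note that npm only warns on an engine mismatch by default; setting `engine-strict=true` in `.npmrc` turns the warning into an install error.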

userbinator 56 minutes ago
"support" and "works" are two different things.
Griffinsauce 13 hours ago
You wouldn't if you looked more deeply at this. He doesn't push for simplicity but for horrible complexity, with an enormous stack of polyfills, ignoring language features that would greatly reduce all that bloat.
userbinator 57 minutes ago
That's also a problem. I've written JS that would work on any browser from the latest ones all the way back to IE5, and I'm not even a professional JS developer. It's not hard.

Maybe "professional" is the problem: they're incentivised to make work for themselves so they deliberately add this fragility and complexity, and ignore the fact that there's no need to change.

prinny_ 10 hours ago
I believe if you read this article https://www.artmann.co/articles/30-years-of-br-tags your "wtf is wrong with js developers" question will be answered.
hrmtst93837 14 hours ago
[dead]
prinny_ 10 hours ago
Everyone trash talking the JS ecosystem without contributing the slightest to the conversation would benefit a lot if they read https://www.artmann.co/articles/30-years-of-br-tags in order to understand the evolution of the language and its tooling.

Nobody argues what we currently have is great and that we shouldn't look to improve it. Reducing it to "JS developers bad" is an embarrassing statement and just shows ignorance, not only of the topic at hand, but of an engineering mindset in general.

follie 9 hours ago
I find the mindset of trying to understand and accept the bad fine in moderation, but defeatist when taken too far. It doesn't matter why JS is bad; it will harm your future prospects if you approach it with too much acceptance. We always need to be examining the practice in front of us and the theory that would be a better replacement for it, and trying to make the leaps at the right times to keep getting paid while not becoming part of the problem ourselves.

"Science advances one funeral at a time" applies to software too, with things going at a faster pace, so a good software engineer needs to fake a few funerals, or really be senior at 4 years to be dead by 7.

KronisLV 9 hours ago
> “JS developers bad“

I found it to be a nice post that documents why things sometimes are bad. It didn’t feel accusatory at the developers themselves, but seemed to serve as a reasonable critique of the status quo?

n_e 9 hours ago
I assume they were talking about the comments here, not the post which I agree is great.
n_e 9 hours ago
[dead]
rtpg 15 hours ago
I think on the first point, we have to start calling out authors of packages which (IMO) have built out these deptrees to their own subpackages basically entirely for the purpose of getting high download counts on their github account

Like seriously... at 50 million downloads maybe you should vendor some shit in.

Packages like this which have _7 lines of code_ should not exist! The metadata of the lockfile is bigger than the minified version of this code!

At one point in the past like 5% of create-react-app's dep list was all from one author who had built out their own little depgraph in a library they controlled. That person also included download counts on their Github page. They have since "fixed" the main entrypoint to the rats nest though, thankfully.

https://www.npmjs.com/package/has-symbols

https://www.npmjs.com/package/is-string

https://github.com/ljharb
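For scale, here is roughly what such a micro-package boils down to once inlined. This is a simplified sketch (the real is-string additionally defends against a spoofed `Symbol.toStringTag`), but it shows why "the lockfile metadata is bigger than the code" is not an exaggeration:

```javascript
// Inlined equivalent of an is-string-style check: primitive strings plus
// boxed String objects, via the Object.prototype.toString brand check.
const isString = (v) =>
  typeof v === 'string' ||
  Object.prototype.toString.call(v) === '[object String]';

console.log(isString('hi'));             // true
console.log(isString(new String('hi'))); // true - boxed String object
console.log(isString(42));               // false
```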

matheusmoreira 15 hours ago
> entirely for the purpose of getting high download counts on their github account

Is this an ego thing or are people actually reaping benefits from this?

Anthropic recently offered free Claude to open source maintainers of repositories with over X stars or over Y downloads on npm. I suppose it is entirely possible that these download statistics translate into financial gain...

domenicd 10 hours ago
Yes, there's definitely a financial gain aspect here. Tidelift provides $50/month for each of these packages. https://tidelift.com/lifter/search/npm/has-symbols

The incentives are pretty clear: more packages, more money.

g947o 7 hours ago
martijnvds 15 hours ago
I've seen people brag about it in their resumes, so I assume it helps them find (better paying?) work.
stephenr 13 hours ago
I'm completely apathetic about spicy autocomplete for coding tasks and even I wonder which terrible code would be worse.

The guy who wrote is-even/is-odd was for ages using a specifically obscure method that made it slower than `% 2 === 0`, because JS engines were optimising that but not his arcane bullshit.
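For reference, the entire useful surface of those two packages, inlined, is a couple of lines (a sketch):

```javascript
// The inlined equivalents of is-even / is-odd for integers.
const isEven = (n) => n % 2 === 0;
const isOdd  = (n) => !isEven(n);

console.log(isEven(4), isOdd(4)); // true false
```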

g947o 7 hours ago
https://immich.app/cursed-knowledge

> There is a user in the JavaScript community who goes around adding "backwards compatibility" to projects. They do this by adding 50 extra package dependencies to your project, which are maintained by them.

> 6/28/2024

CoderLuii 15 hours ago
from a security perspective this is even worse than it looks. every one of those micro packages is an attack surface. we just saw the trivy supply chain get compromised today and thats a security tool. now imagine how easy it is to slip something into a 7 line package that nobody audits because "its just a utility." the download count incentive makes it actively dangerous because it encourages more packages not fewer.
h4ch1 15 hours ago
I remember seeing this one guy who infiltrated some gh org, and then started adding his own packages to their dependencies or something to pad up his resume/star count.

Really escapes me who it was.

g947o 7 hours ago
eudamoniac 4 hours ago
Christ. What a psycho.
h4ch1 7 hours ago
yes! this.
technion 13 hours ago
As usual, there's a cultural issue here. I know it's entirely possible to paste those seven lines of code into your app. And in many development cultures this will be considered a good thing.

If you're working with Javascript people, this is referred to as "reinventing the wheel" or "rolling your own", or any variation of "this is against best practice".

rtpg 13 hours ago
I think the fact that everyone cites the same is-number package when saying this is indicative of something though.

Like I legit think that we are all imagining this cultural problem that's widespread. My claim (and I tried to do some graph theory stuff on this in the past and gave up) is that in fact we are seeing something downstream of a few "bad actors" who are going way too deep on this.

I also dislike things like webpack making every plugin an external dep but at least I vaguely understand that.

serial_dev 13 hours ago
Have you heard of the left pad incident?

The problem is not imagined.

saghm 13 hours ago
The point isn't that everyone needs to write the same code manually necessarily. It's that an author could easily just combine the entire tree of seven-line packages into the one package that create-react-app uses directly. There's no reason to have a dozen or so package downloads, each with seven lines of code, instead of one that's still under a hundred lines; that's still a pretty small network request, and it's not like dead-code analysis to prune unused functions isn't a thing. If you somehow find yourself in a scenario where you would be happy to download seven lines of code, but downloading a few dozen more would be an issue, that's when you might want to consider pasting the seven lines of code manually, but I honestly can't imagine when that would be.
stephenr 13 hours ago
The problem I think is that the js community somehow thinks that being on npm is some bastion of good quality.

Just as the cloud is simply someone else's computer, a package is just someone else's reinvented wheel.

The problem is half the wheels on npm are fucking square and apparently no one in the cult of JavaScript realises it.

robpalmer 10 hours ago
The article and (overall) this comments section has thankfully focused on the problem domain, rather than individuals.

As the article points out, there are competing philosophies. James does a great job of outlining his vision.

Education on this domain is positive. Encouraging naming of dissenters, or assigning intent, is not. Folks in e18e who want to advance a particular set of goals are already acting constructively to progress towards those goals.

whstl 7 hours ago
People aren't criticizing the development philosophy in this subthread. This has been done by the article itself and by several people before.

What people are criticizing is the approach in pushing this philosophy into the ecosystem for allegedly personal gain.

The fact that this philosophy has been pushed by a small number of individuals shows this is not a widespread belief in the ecosystem. That they are getting money out of the situation demonstrates that there is probably more to the philosophy than the technical merits of it.

This is a discussion that needs to happen.

hinkley 15 hours ago
Hat tip to Sindre who has fifty bagillion packages but few of them depend on more than one of his other packages.
12345hn6789 7 hours ago
stephenr 15 hours ago
As usual, he's copying someone else who's been doing this for years:

https://www.npmjs.com/package/is-number - and then look and see shit like is odd, is even (yes two separate packages because who can possibly remember how to get/compare the negated value of a boolean??)

Honestly for how much attention JavaScript has gotten in the last 15 years it's ridiculous how shit it's type system really is.

The only type related "improvement" was adding the class keyword because apparently the same people who don't understand "% 2" also don't understand prototypal inheritance.

zahlman 14 hours ago
To be fair, prototypal inheritance is relatively uncommon language design. I'd rank it as considerably harder to understand than the % operator.
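For the curious, the prototypal model under discussion fits in a few lines. A sketch using `Object.create` directly, i.e. the mechanism the `class` keyword is sugar over:

```javascript
// Prototypal inheritance without the `class` keyword: objects delegate
// directly to other objects via their prototype link.
const animal = {
  speak() { return this.name + ' speaks'; }
};
const dog = Object.create(animal); // dog delegates missing lookups to animal
dog.name = 'Rex';

console.log(dog.speak());                           // Rex speaks
console.log(Object.getPrototypeOf(dog) === animal); // true
```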
stephenr 13 hours ago
That's a good point, it's only been around for 30 years, and used on 95% of websites. It's not really popular enough for a developer to take an hour or two to read how it works.
saghm 13 hours ago
The word "used" is doing some heavy lifting there. Not all usage is equal, and the fact that it's involved under the hood isn't enough to imply anything significant. Subatomic physics is used by 100% of websites and has been around for billions of years, but that's not a reason to expect every web developer to have a working knowledge of electron fields.
stephenr 12 hours ago
Fair point.

Let's compromise and say that whoever is responsible for involving (javascript|electron fields) in the display of a website, should each understand their respective field.

I don't expect a physicist or even an electrical engineer or cpu designer to necessarily understand JavaScript. I don't expect a JavaScript developer to understand electron fields.

I do expect a developer who is writing JavaScript to understand JavaScript. Similarly I would expect the physicist/etc to understand how electrons work.

saghm 53 minutes ago
The issue with this framing is that understanding something isn't a binary; you don't need to be an expert in every feature of a programming language to be able to write useful programs in it. The comment above describing prototypical inheritance as esoteric was making the point that you conflated the modulus operator with it as if they're equally easy to understand. Your responses don't seem to indicate you agree with this.

It sounds like you expect everyone to understand 100% of a language before they ever write any code in it, and that strikes me as silly; not everyone learns the same way, and some people learn better through practice than by reading about things without practice. People sometimes have the perception that anyone who prefers a different way of learning than them is just lazy or stupid for not being able to learn in the way that they happen to prefer, and I think that's both reductive and harmful.

SachitRafa 16 hours ago
The cross-realm argument for packages like is-string is the one I find hardest to dismiss, but even there the math doesn't add up. The number of projects actually passing values across realms is tiny, and those projects should be the ones pulling in cross-realm-safe utilities, not every downstream consumer of every package that ever considered it.

The deeper problem with Pillar 2 is that atomic packages made sense as a philosophical argument but broke down the moment npm made it trivially easy to publish. The incentive was "publish everything, let consumers pick what they need", but the reality is consumers never audit their trees; they just install and forget. So the cost that was supposed to be opt-in became opt-out by default.

The ponyfill problem feels most tractable to me. A simple automated check ("does every LTS version of Node support this natively?") could catch most of these. The e18e CLI is a good start, but it still requires someone to run it intentionally. I wonder if something like a Renovate-style bot that opens PRs to remove outdated ponyfills would move the needle faster than waiting for maintainers to notice.
burntoutgray 17 hours ago
I have a single pillar, admittedly for in-house PWAs: Upgrade to the current version of Chrome then if your problem persists, we'll look into it.
yurishimo 11 hours ago
This is how it should be for internal stuff! Corporate IT wants everyone to update anyway so there really isn’t a downside.

One thing I kinda understand is users who want to use a more performant browser (safari really does sip memory I’ve found compared to chrome) but that’s kind of a side point. But if your company decides this is the browser(s) we support, then it makes sense and is the right way to go about it.

GianFabien 14 hours ago
Keeping it simple usually saves the day.
AltruisticGapHN 9 hours ago
"some people apparently exist who need to support ES3 - think IE6/7, or extremely early versions of Node.js"

Seriously what kind of business today needs to support ES3 browsers? Even banking sites should refuse to run on such old devices out of security concerns.

skrebbel 9 hours ago
This is 100% teams who set up their build tooling back in 2015 and haven't updated since. There are plenty of widely used apps and libs that date this far back, and back then IE8 compat was still considered pretty important, esp for products targeting enterprise/government customers.

Upgrading eg Webpack and Babel and polyfill stacks and all that across multiple major versions is a serious mess. Lots of breaking changes all around. Much better to just ship features. If it ain't broke, don't fix it!

esprehn 8 hours ago
No one really does, but there's one particular individual who keeps pushing to support things like node 0.3 and who also maintains all those low level intrinsic packages.
jgilias 8 hours ago
I remember reading somewhere that Deutsche Bahn is running Windows 3.1 for something still?
szatkus 6 hours ago
Somehow I doubt they use it to access another Next.js app created by some startup from SV.
g947o 7 hours ago
https://immich.app/cursed-knowledge

> There is a user in the JavaScript community who goes around adding "backwards compatibility" to projects. They do this by adding 50 extra package dependencies to your project, which are maintained by them.

> https://github.com/immich-app/immich/pull/10690

12345hn6789 7 hours ago
A little more context since they scrubbed that PR.

https://news.ycombinator.com/item?id=45447390

or

https://github.com/A11yance/axobject-query/pull/354

This user actively gets paid based on how many downloads their packages get, which explains why there are so many - as well as the push to change other people's repositories to use his packages.

wheattoast 9 hours ago
“Alternatively, what’d be really nice is if they upgraded“

Easy enough for y’all with techie salaries, but as one of the millions of poor folks whose paychecks barely (or don’t even) pay the bills, it’d be really nice if we didn't have to junkheap our backbreakingly expensive hardware every few years just cuz y’all are anorexically obsessed with lean code, and find complex dependencies too confusing/bothersome to maintain.

jefftk 9 hours ago
They're talking about people still running ES3 browser engines, like IE8, which was released 15+ years ago and went EOL 10+ years ago. The author could have done a better job clarifying this, but they're not pushing for a world with 2y device lifetimes.
abanana 6 hours ago
Indeed, they're talking about the opposite extreme from the usual problem we all bemoan in here, which is JS devs being determined to use the newest shiniest thing as soon as it's been announced, instead of being willing to continue to use what they've always used and to wait until the new stuff works across all browsers. This article really surprised me, in how far some are apparently going in the opposite direction. I'm very surprised the baseline mentioned is ES3 rather than ES5 or 6.

The GP's comment - that we have to upgrade our hardware because devs are "anorexically obsessed with lean code, and find complex dependencies too confusing/bothersome" - is surely the exact opposite of reality? We have to upgrade to faster hardware because the bloat slows everything down!

wheattoast 5 hours ago
Fair, but personally I’d absolutely prefer slower bloated code with twice the lifespan to faster code that forces me to buy new hardware I can’t afford. But I’m a nearly extinct type of consumer who happily clings to pre-subscription-era software (e.g., Photoshop 7, Sketchup 2017). I understand and begrudgingly accept that businesses couldn’t survive by tending to the desires of folks like me.
wheattoast 5 hours ago
Thanks for the clarification. I did not understand.

My knee-jerky reaction to the author’s blithe exhortation to upgrade stems from pain of watching as my prized workhorse (a 2015 MacBook) dies in my arms despite its magnificently healthy and powerful body.

socalgal2 2 hours ago
There was a time I'd use dependencies for trivial things like copying a file during building or running things in parallel. Now I just script those in js and call that js from my build. Even testing is now included in node so I stopped using a testing framework.
tylerchilds 5 hours ago
I agree, but would also posit a parallel The Three Pillars of JavaScript Ecosystem Bloat

for example, javascript runs in a browser or on microcontrollers. you can write code that works for both natively [1].

TypeScript-- a mechanism that needs to compile first into javascript

React-- a mechanism that needs to compile first into javascript

Configuration-du-jour-- Depending on how you need to string your TypeScript and React together, there's a thing you need to manage your javascript managers. Vite is the best option in this field, since it recognizes that exposing tools to fine-tune how your TypeScript and React get optimized into javascript is a terrible idea that leads to mass fragmentation, on a global scale, of what it even means to "spin up a js project"

In conclusion, is javascript a compile target like assembly, or a language that people can handcode to eke performance out of, like assembly?

[1]: https://github.com/bellard/mquickjs

tylerchilds 3 hours ago
To the downvote— you know JavaScript is the blitting engine for <company that pioneered the button on your remote to stream Hollywood to your living room on a shitty smart TV microcontroller>, right?
derodero24 7 hours ago
The polyfill treadmill is the one that gets me. I work on native Node addons so the JS layer is basically just a loader and types — no polyfills, no transpilation targets. Really puts into perspective how much of a typical npm package is compatibility layers for environments nobody actually runs anymore. josephg's right that just tracking Node's EOL schedule would kill a huge chunk of this overnight.
DanielHB 6 hours ago
> Atomic architecture

> [...]

> Each of these having only one consumer means they’re equivalent of inline code but cost us more to acquire (npm requests, tar extraction, bandwidth, etc.).

It costs FAR more than dep install time. It has a runtime cost too, and in frontend code using bundlers it also costs extra bundle size and extra build time.

algolint 10 hours ago
It's interesting how we've reached a point where 'vanilla' is seen as an obscure domain of knowledge. The 'gravity' of frameworks like React is so strong that for many new developers, the framework IS the web. Breaking out of that mindset often reveals that the browser has actually evolved quite a bit and can handle a lot of what we used to reach for libraries for, especially with Web Components and CSS Grid/Flexbox being so mature now.
whstl 7 hours ago
I like to criticize React as much as the next person, but this is an JS ecosystem problem around third-party libraries, not a React problem per se.

If you're using third-party NPM packages to do "Vanilla", you will probably run into the same problem.

If you import React directly from a CDN, you won't.

algolint 2 hours ago
on the other hand, AI assisted coding may open an avenue where developers choose native over framework
algolint 2 hours ago
[dead]
procaryote 12 hours ago
The most frustrating thing with the "Atomic architecture" bit with tiny packages is how obviously stupid it is. Any borderline sane person should look at isOdd/isEven and see that it's an awful idea

Instead they've elevated it to a cultural pillar and think they've come up with a great innovation. It's like talking to antivaxers

RadiozRadioz 11 hours ago
It's because it has a smart-sounding name. Some people are shallow and performative; some nice-looking blog post says they can have "atomic architecture", then the trend starts and everybody wants to show how enlightened they are.
whstl 8 hours ago
It's not just the name or the smart explanation.

Atomic packages brings more money to the creators.

If you have two useful packages it's hard to ask for money, even if they're used by Babel or some popular React dependency.

If you have 900 packages that are transitive dependencies of the same couple of deps above, it's way easier to get sponsorship. This is a way to advertise themselves: "I maintain 1000 packages".

The first guy that did this in a not-nice way was a marketing/salesperson and has mentioned that they did it on purpose to launch their dev career.

TLDR: This is just some weird ass pyramid thing to get Github sponsors or clout.

williamcotton 11 hours ago
That’s not how we started down this path. See snark-free sibling comment from padjo.
RadiozRadioz 10 hours ago
Both my claim and theirs are unsupported by evidence, therefore they are equally valid.
padjo 10 hours ago
A third argument is that it was because of aliens from the planet Blotrox Prime. But I suppose without evidence we'll just have to accept that all three theories are equally probable.
RadiozRadioz 9 hours ago
Interesting how you decided to switch to hyperbole instead of providing evidence for your claim. Backing up your viewpoint would have easily shut me down, putting the ball in my court to do the same. Instead you gave a knee-jerk childish response.
padjo 9 hours ago
Interesting that rather than try to bolster your claim you resorted to a logical fallacy to justify it.
RadiozRadioz 8 hours ago
Hypocritical; you did the same with the hyperbole. Why are you stooping to my level instead of being the better person?
padjo 8 hours ago
Nope. Just a reductio ad absurdum that you decided to counter by asking that I maintain higher standards of debate than you.

The notion that atomic architecture came about because people are stupid and performative is not really useful. It's fairly misanthropic and raises the question of why it became so prevalent in JS specifically.

padjo 11 hours ago
The philosophy was kinda refreshing in the early days. There was a really low barrier to publishing and people were encouraged to build and share tools rather than hoard things. It was probably somewhat responsible for the success of npm and the node ecosystem, especially given the paltry standard lib.

Of course, like most things, when taken to an extreme it becomes absurd and you end up with isOdd.

egeozcan 10 hours ago
I think the issue is that the JavaScript ecosystem is so large that even the strangest extremes manage to survive. Even if they resonate with just 0.1% of developers, that’s still a lot of developers.

The added problem with the atomic approach is that it makes it very easy for these fringes to spread throughout the ecosystem. Mostly through carelessness, and transitive dependencies.

IsTom 11 hours ago
I've seen some juniors writing risoni code like that. They've heard that you shouldn't write big functions, so obviously they forcefully split things until they can't be split anymore.
groundzeros2015 6 hours ago
> It's like talking to antivaxers

This is not helpful.

il-b 15 hours ago
The elephants in the room are react and webpack.
est 15 hours ago
More like a nodejs bloat rather than JS bloat.

For personal projects I always prompt the AI to write JS directly, and never introduce the Node.js stack unless I absolutely have to.

Turns out you don't always need Node.js/React to make a functional SPA.

kennywinker 15 hours ago
You’ve traded supply chain vulnerability for slop vulnerability.
yurishimo 11 hours ago
Except your supply chain could also be slop and you have no idea (unless you’re auditing your dependencies, right?).

I’d take vibe coded vanilla js slop over npm dependency hell every day of the week.

wiseowise 11 hours ago
> Using the e18e CLI to detect replaceable dependencies

https://github.com/e18e/cli

That’s awesome. Could be hooked as a pre-commit for agents to do the grunt work of migration.

gameroman 4 hours ago
e18e CLI is great
skrtskrt 15 hours ago
the fact that you can just redefine Map in a script is mind boggling
xigoi 10 hours ago
Why? Being able to redefine anything is table stakes in dynamic languages.
fragmede 7 hours ago
You can't assign a value to false, for example, so "anything" isn't everything (Node v22.17.0).

    > false = 4
    false = 4
    ^^^^^
    
    Uncaught SyntaxError: Invalid left-hand side in assignment
Fascinatingly enough though, you can assign a value to NaN. It just doesn't stick.

    > NaN
    NaN
    > NaN = 42
    42
    > NaN
    NaN
    >
(Map behaves as described.)
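To put the grandparent's Map point in code, a minimal sketch of shadowing the built-in (nothing here is special syntax; the binding is just an ordinary writable property of the global object):

```javascript
// Shadow the global Map: plain assignment replaces it.
const RealMap = Map; // keep a reference to the original

Map = function FakeMap() {
  return { fake: true }; // every `new Map()` now yields this impostor
};

const m = new Map();
console.log(m.fake); // true -- callers silently get the fake

Map = RealMap; // restore the built-in
console.log(new Map() instanceof RealMap); // true
```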
g947o 7 hours ago
So? Does being able to do something mean you should?
xigoi 6 hours ago
I didn’t say you should. The comment I’m replying to expressed surprise that redefining something is even possible.
lerp-io 12 hours ago
just make React Native target the browser, and everything else that's a one-off can be AI generated
skydhash 17 hours ago
Fantastic write up!

And we're seeing rust happily going down the same path, especially with the micro packages.

chrismorgan 6 hours ago
Rust is not going down the same path, and it’s ludicrous to suggest it is. Almost none of the first and third pillars are even possible in Rust, and to the extent they are, they’re not a problem in practice. As for the second, “atomic architecture”, it’s not taken anywhere near the extreme it frequently is with npm. There are not many micro-packages that get used, and where they are, they mostly make more sense than they did in npm, and they don’t have anywhere near the cost they do in npm, and can have some concrete advantages.
vsgherzi 12 hours ago
Yeah, I’m in the same boat here. I really don’t like the dependency sprawl of Rust. I understand there are tradeoffs, but I really wanna make sure we don’t end up like npm.
CoderLuii 15 hours ago
the docker side of this is painful too. every extra dependency in any language means a bigger image, more layers to cache, more things that can break during a multi-arch build. ive been building images that are 4GB because of all the node and python tooling bundled in. micro packages make it worse because each one adds metadata overhead on top of the actual code.
cute_boi 16 hours ago
Rust is different as there is no runtime.
wiseowise 11 hours ago
Yes, and instead we pay by requiring supercomputers and 10-hour compile times to process billions of those “atomic architecture” pieces.
b00ty4breakfast 13 hours ago
I'm not very familiar with rust but I'm pretty sure it has a runtime. Even C has a runtime.

Unless you're talking about an "environment" eg Node or the like

embedding-shape 10 hours ago
Indeed Rust has a runtime. I'm not sure where the whole "Rust has no runtime" claim comes from; I keep seeing it repeated from time to time, but I can't find its origin, and I don't think it's ever been true.
onlyspaceghost 15 hours ago
but it still increases compile time, attack surface area, bandwidth use, etc.
vsgherzi 12 hours ago
I’m assuming you’re referring to an async runtime like tokio. In my opinion the dependency problem exists with or without tokio. Tokio is probably one of the best dependencies.
IAmLiterallyAB 14 hours ago
For the old version support: why not do some compile-time #ifdef SUPPORT_ES3? That way library writers can support it, and if the user doesn't need it they can disable it at compile time and all the legacy code will be removed.
sgbeal 8 hours ago
> Why not do some compile time #ifdef SUPPORT_ES3?

Rather unfortunately, JS has no native preprocessor. For the SQLite project we wrote our own preprocessor to deal with precisely that type of thing (not _specifically_ that thing, but filtering code based on, e.g., whether it's vanilla, ESM, or "bundler-friendly" (which can't use dynamically-generated strings because of castrated tooling)).
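A homegrown filter of that kind can be surprisingly small. A toy sketch (the `//#if` marker syntax here is made up for illustration, not SQLite's actual tool, and nesting isn't handled):

```javascript
// Build-time filter: drop regions between "//#if FLAG" ... "//#endif"
// markers when FLAG is disabled in the build configuration.
function stripDisabled(source, flags) {
  const out = [];
  let skipping = false;
  for (const line of source.split("\n")) {
    const open = line.match(/^\s*\/\/#if\s+(\w+)/);
    if (open) { skipping = !flags[open[1]]; continue; }
    if (/^\s*\/\/#endif/.test(line)) { skipping = false; continue; }
    if (!skipping) out.push(line);
  }
  return out.join("\n");
}

const src = [
  "function pad(s, n) {",
  "//#if SUPPORT_ES3",
  "  while (s.length < n) s = ' ' + s; return s;",
  "//#endif",
  "  return s.padStart(n);",
  "}",
].join("\n");

console.log(stripDisabled(src, { SUPPORT_ES3: false }));
// the ES3 fallback line is gone from the output
```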

Griffinsauce 12 hours ago
Two problems:

- people would need to know how to effectively include dependencies in a way that allows them to be tree shaken; that's a fragile setup
- polyfills often have quirks and extra behaviours (e.g. the extra functions on early promise libraries come to mind) that people start relying on, making the switch to built-ins not so easy

Also, how is this going to look over time with multiple ES versions?
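On the tree-shaking point, the fragility is mostly about import shape. A sketch of the pattern bundlers can actually analyze (file layout illustrative):

```javascript
// utils.js -- ESM named exports are statically analyzable, so a
// bundler can keep only what's actually imported:
export const isOdd = (n) => n % 2 !== 0;
export const isEven = (n) => n % 2 === 0;
export const clamp = (n, lo, hi) => Math.min(hi, Math.max(lo, n));

// app.js -- only isOdd survives shaking; isEven and clamp are dropped:
//   import { isOdd } from "./utils.js";
//
// Whereas `require("./utils")` or a namespace object used dynamically
// is opaque to the bundler, which then has to keep everything.
```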

sgbeal 8 hours ago
> people would need to know how to effectively include dependencies in a way that allows them to be tree shaken

Is the need for tree-shaking not 100% a side-effect of dependency-mania? Does it not completely disappear once one has one's dependencies reduced to their absolute minimum?

Maybe i'm misunderstanding what tree-shaking is really for.

ascorbic 12 hours ago
It'll still install the dependencies, which is what this is about
turtleyacht 17 hours ago
It would be interesting to extend this project where opt-in folks submit a "telemetry of diffs," to track how certain dependencies needed to be extended, adapted, or patched; those special cases would be incorporated as future features and new regression tests.

Someday, packages may just be "utility-shaped holes" which are filled in and published on the fly. Package adoption could come from 80/20 agents [1] exploring these edges (security notwithstanding).

However, as long as new packages inherit dependencies according to a human author's whims, that "voting" cycle has not yet been replaced.

[1] https://news.ycombinator.com/item?id=47472694

casey2 13 hours ago
There is a clear and widespread cultural problem with javascript. Sites should think seriously hard about server side rendering, both for user privacy (can't port the site to i2p if you drop 5MB every time they load a page) and freedom. Even this antibloat site smacks you with ~100KB and links to one that smacks you with ~200KB. At this rate if you follow 20 links you'll hit a site with 104 GB of JS.
g947o 7 hours ago
> Sites should think seriously hard about server side rendering

You think the average site owner plus wix/squarespace is going to spend a lot of money beefing up their CPU and RAM to marginally "improve user experience" when they could and have been offloading rendering client side all these years?

sgbeal 8 hours ago
> Sites should think seriously hard about server side rendering...

The rise of AI crawlers makes that ever less appetizing. Moving the workloads to the client is, among other things, a form of DoS mitigation.

sheept 17 hours ago
I wonder if this means there could be a faster npm install tool that pulls from a registry of small utility packages that can be replaced with modern JS features, and skips installing them.
seniorsassycat 16 hours ago
Not sure about faster, but you could do something with overrides, especially pnpm overrides since they can be configured with plugins. Build a list of packages that can be replaced with modern stubs.

It couldn't inline them, but it could replace ponyfills with wrappers for native impls, and drop the fallback. It could provide simple modern implementations of is-string, and dedupe multiple major versions, though that begs the question of what breaking change led to a new major version, and why?
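A sketch of what such a stub could look like, with the pnpm override wiring (stub path is illustrative, and the real is-string package also handles cross-realm String objects):

```javascript
// stubs/is-string/index.js -- drop-in for the "is-string" package,
// delegating to modern built-ins instead of shipping its own checks:
function isString(value) {
  return typeof value === "string" || value instanceof String;
}

// Same CommonJS shape as the original package:
if (typeof module !== "undefined") module.exports = isString;

// package.json then redirects every resolution of is-string:
//
//   "pnpm": { "overrides": { "is-string": "file:./stubs/is-string" } }
```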

stephenr 15 hours ago
The primary cause of JS bloat is assuming you need JS or that customers want whatever you're using it to provide.

For $client we've taken a very minimal approach to JavaScript, particularly on customer facing pages. An upcoming feature finally replaces the last jquery (+ plugin) dependent component on the sales page, with a custom implementation.

That change shaved off ~100K (jquery plus a plugin removed) and for most projects now that probably seems like nothing.

The sales page after the change is now just 160K of JS.

The combination of not relying on JS for everything and preferring use-case-specific implementations where we do, means we aren't loading 5 libraries and using 1% of each.

I'm aware that telling most js community "developers" to "write your own code" is tantamount to telling fish to "just breathe air".

CoderLuii 15 hours ago
160K total is impressive. most landing pages i see are shipping 2-3MB of js before the first paint. the "write your own code" approach gets laughed at but when you actually do it the result is faster, easier to debug, and you dont wake up one morning to find out one of your 200 dependencies got compromised.
stephenr 9 hours ago
Wait till I tell those people we keep all our dependencies (js and backend) in our own git repo.

Updating dependencies is a task a person does, followed by committing the changes to the repo.

I am aware a lot of these ideas are heretical to a lot of software developers these days.

wonnage 12 hours ago
An underappreciated source of bloat is module duplication stemming from code splitting. SPAs have a bad rep because you don't expect to download an entire app just to load one page on the web. You can solve this by code splitting. But if you just naively split your app by route, you'll end up with duplicate copies of every shared module.

Bundlers handle this by automatically creating bundles for shared modules. But if you optimize to avoid all shared modules, you end up with hundreds of tiny files. So most bundlers enforce a minimum size limit. That's probably fine for a small app. But one or more of these things happens:

1. Over time everybody at the company tends to join one giant SPA because it's the easiest way to add a new page.

2. Code splitting works so well you decide to go ham and code split all of the things - modals, below-the-fold content, tracking scripts, etc.

Now you'll run into situations where 20 different unrelated bundles happen to share a single module, but that module is too small for the bundler to split out, and so you end up downloading it N times.
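In webpack terms, the trade-off described above lives in the splitChunks options; a sketch (the thresholds are webpack 5's documented defaults, the extra cacheGroup is illustrative):

```javascript
// webpack.config.js (fragment): a shared module is only hoisted into
// its own chunk once it crosses minSize, so tiny modules shared by
// many bundles get duplicated into each of them instead.
module.exports = {
  optimization: {
    splitChunks: {
      chunks: "all",
      minSize: 20000,          // webpack 5 default: ~20 kB before splitting
      maxInitialRequests: 30,  // cap on parallel requests per entrypoint
      cacheGroups: {
        // One way out: force even tiny modules shared by 2+ chunks into
        // a common chunk, trading extra requests for no duplication.
        shared: { minChunks: 2, minSize: 0, reuseExistingChunk: true },
      },
    },
  },
};
```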

sylware 7 hours ago
It is not JavaScript itself (as long as the interpreter is written in plain and simple C or similar); it is the abomination of the web engine, one of the 2.5 from the whatwg cartel.
sipsi 16 hours ago
i suggest jpeg.news dot com
steveharing1 12 hours ago
So the guy who called JS, a weird language was not wrong huh?
qayxc 11 hours ago
Look at Python - similar story. Once a reasonably usable global package registry exists, this is exactly what happens. Languages and standard libraries evolve, shipped code more often than not doesn't.
steveharing1 11 hours ago
Makes sense
grishka 13 hours ago
Yes, of course the tiny packages cause some of the bloat. As mainly a Java developer being pretty paranoid about my dependency tree (I'm responsible for every byte of code I ship to my users, whether I wrote it or not), I'm always blown away by JS dependency trees. Why would you reach for a library for this three-line function? Just write it yourself, ffs.
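For the record, the kind of three-liner being talked about, e.g. what the is-odd package boils down to (a sketch; the published package's exact checks may differ):

```javascript
// is-odd, written in place instead of installed:
function isOdd(n) {
  if (!Number.isInteger(n)) throw new TypeError("expected an integer");
  return Math.abs(n % 2) === 1; // abs handles the negative-zero remainder
}

console.log(isOdd(3));  // true
console.log(isOdd(-2)); // false
```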

But the real cause of JS bloat is the so-called "front-end frameworks". Especially React.

First of all, why would you want to abstract away the only platform your app runs on? What for? That just changes the shape of your code but it ends up doing the same thing as if you were calling browser APIs directly, just less efficiently.

Second of all, what's this deal with mutating some model object, discarding the exact change that was made, and then making the "framework" diff the old object with the new one, call your code to render the "virtual DOM", then diff that, and only then update the real DOM tree? This is such an utterly bonkers idea to me. Like, you could just modify your real DOM straight from your networking code, you know?

Seriously, I don't understand modern web development. Neither does this guy who spent an hour and some to try to figure out React from the first principles using much the same approach I myself apply to new technologies: https://www.youtube.com/watch?v=XAGCULPO_DE

padjo 11 hours ago
> you could just modify your real DOM straight from your networking code

You can also use your underparts as a hat. It doesn't mean it's a good idea.

grishka 8 hours ago
You imply that you somehow get a visibly different end result if you touch DOM directly. Except to me, using React instead of a simple assignment to e.g. update the text on a button feels like taking several long flights that complete a lap around the world just to get from LA to SF, instead of the 1-hour direct flight.
padjo 6 hours ago
It's a case of Chesterton's fence. Having built complex apps pre-react, I wouldn't be in a hurry to go back to that approach because I have first hand experience of running into the problems it solves.
skydhash 7 hours ago
React is a paradigm change (from imperative to functional) that makes sense in a large UI project. React itself is fairly small in terms of deps.

The main issue is the tooling. JSX is nice enough (not required though) to make you want a transpiler, which will also bundle your app. It’s from that point that things get crazy. They want the transpiler to also be a bundler so that it manages their CSS as well. They also want it to do minification and dead code elimination. They want it to support npm dependencies, etc.

This is how you get weird ecosystems.

ascorbic 12 hours ago
That's like asking "why would you use Swing when you can use Graphics2D". Sometimes you want something higher level. The DOM is great and very powerful, but when you're building a highly interactive web app you don't want to be manually mutating the DOM every time state changes.

I am a core maintainer of Astro, which is largely based around the idea that you don't need to always reach for something like React and can mostly use the web platform. However even I will use something like React (or Solid or Svelte or Vue etc) if I need interactivity that goes beyond attaching some event listeners. I don't agree with all of its design decisions, but I can still see its value.

srdjanr 11 hours ago
Regarding tiny packages, I don't think they affect the size of the shipped bundle at all. They only bloat your local dev environment.
wiseowise 11 hours ago
> Second of all, what's this deal with mutating some model object, discarding the exact change that was made, and then making the "framework" diff the old object with the new one, call your code to render the "virtual DOM", then diff that, and only then update the real DOM tree? This is such an utterly bonkers idea to me. Like, you could just modify your real DOM straight from your networking code, you know?

https://youtu.be/Q9MtlmmN4Q0?t=519&is=Wt3IzexiOX4vMPZf

Also, why do you use SQL and databases? Couldn’t you just modify files on the filesystem?

grishka 8 hours ago
Yes, I don't understand the "declarative" approach at all; it seems too wasteful and roundabout to me. You want to change something? You go and change it. That simple. I hate it when callbacks are abstracted away from me. Abstractions over callbacks always feel like they're getting in the way, not helping me.

> Also, why do you use SQL and databases? Couldn’t you just modify files on the filesystem?

Anyone can read a MySQL data file. IIRC the format is pretty straightforward. The whole point of doing it through the real MySQL server is to make use of indexes, the query optimizer, and proper handling of concurrency, at least. Sure you can reimplement those things, but at this point congrats, you've just reimplemented the very database system you were trying to avoid, just worse.

thomasikzelf 10 hours ago
The declarative vs imperative example is strange here. Why is the imperative example so convoluted? This is what one could write in js:

  badge.textContent = count > 99 ? '99+' : count
  badge.classList.toggle('show', count > 0)
  paper.classList.toggle('show', count > 0)
  fire.classList.toggle('show', count > 99)
The declarative example also misses the 99+ case. I don't think this example describes the difference between imperative and declarative well.
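For what it's worth, a declarative version that does handle the 99+ case can stay vanilla too: one pure function from state to view, re-applied on every change (element names follow the snippet above):

```javascript
// Declarative core: derive the entire view from state, regardless of
// which individual field changed.
function render({ count }) {
  return {
    badgeText: count > 99 ? '99+' : String(count),
    badgeShown: count > 0,
    paperShown: count > 0,
    fireShown: count > 99,
  };
}

// Imperative edge: apply the derived view to the real DOM.
function apply(view) {
  badge.textContent = view.badgeText;
  badge.classList.toggle('show', view.badgeShown);
  paper.classList.toggle('show', view.paperShown);
  fire.classList.toggle('show', view.fireShown);
}

// apply(render({ count: 120 })); -- badge shows "99+", fire appears
```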
panstromek 12 hours ago
Yea, honestly you probably just don't understand. FE frameworks solve a specific problem and they don't make sense unless you understand that problem. That TSoding video is a prime example of that - it chooses a trivial instance of that problem and then acts like the whole problem space is trivial.

To be fair, React is an especially wasteful way to solve that problem. If you want to look at the state of the art, something like Solid makes a lot more sense.

It's much easier to appreciate that problem if you actually try to build a complex interactive UI with vanilla JS (or something like jQuery). Once you have a complex state dependency graph and DOM state to preserve between rerenders, it becomes pretty clear.

grishka 8 hours ago
One of my projects does have a complex UI and is built with zero runtime dependencies on the front end. It doesn't require JS at all for most of its functionality.

I just render as much as possible on the server and return commands like "hide the element with that ID" or "insert this HTML after element with that ID" in response to some ajax requests. Outside of some very specific interactive components, I avoid client-side rendering.
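That approach can be sketched as a single generic handler on the client (the command names and shapes here are made up for illustration, not the actual project's protocol):

```javascript
// The server replies to an ajax request with a small list of DOM
// operations; one loop applies them, so no client-side templates.
function applyCommands(commands) {
  for (const cmd of commands) {
    const el = document.getElementById(cmd.id);
    if (!el) continue; // stale ID: element already gone, skip it
    switch (cmd.op) {
      case 'hide':        el.hidden = true; break;
      case 'show':        el.hidden = false; break;
      case 'setText':     el.textContent = cmd.text; break;
      case 'insertAfter': el.insertAdjacentHTML('afterend', cmd.html); break;
    }
  }
}

// e.g.:
//   fetch('/follow', { method: 'POST' })
//     .then((r) => r.json())
//     .then(applyCommands);
```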

panstromek 7 hours ago
That's good and arguably the right default for most websites.
skydhash 7 hours ago
I agree with you. It’s baffling to see websites (not web apps) refusing to show anything if you disable JS. And a lot of such web apps don’t need to be SPAs (GitHub,…)

SPAs were meant for UIs that rely mostly on client state, not on server data (Figma and other kinds of online editors).

krmbzds 15 hours ago
JavaScript bloat is downstream of low FED interest rates.
general_reveal 10 hours ago
Anyone want to tell him programming languages don’t matter anymore?
embedding-shape 10 hours ago
What do you use to build programs then? Or maybe you're not a software developer, in which case I understand not fully knowing how a program gets built, but otherwise languages will be needed for as long as we need programs.
grey-area 7 hours ago
Were you this excited about crypto and NFTs as well?
g947o 7 hours ago
No. They matter.

Show me a website where client side interaction is implemented in perl.

miranaproarrow 10 hours ago
there are people who still like to understand what their language is doing and not offload all their thinking to an LLM
pjmlp 13 hours ago
What about only writing JavaScript when it is actually required, instead of SPAs for any kind of content?

There will be almost no bloat to worry about.

deanc 10 hours ago
I can't help thinking, whenever we have these discussions about dependency hell in the JS ecosystem, that the language moves too slowly to add things to the stdlib. For me, this is where bun fills the gap, and they continue to pump out core stdlib packages that replace widely used dependencies. I'd really like to see Node at least do this more.
onion2k 12 hours ago
Fallback support is a legitimate reason for additional code being in the bundle, but it's not 'bloat' because it's necessary. In an ideal world every website would generate ES5, ES6, and ES2025 bundles and serve the smallest one that's necessary for the app to run based on the browser capabilities, but that is genuinely quite hard to get right and the cost of getting it wrong is a broken app so it's understandable why devs don't.

The other two, atomic architecture and ponyfills, are simply developer inexperience (or laziness). If you're not looking at the source of a package and considering if you actually need it then you're not working well enough. And if you've added code in the past that the metrics about what browsers your visitors are using show isn't needed any more, then you're not actively maintaining and removing things when you can. That's not putting the user first, so you suck.

srdjanr 11 hours ago
Bloat is mostly added by package authors, not website authors. And they can't know who's running it and can't look at the metrics. I doubt many website authors directly use isEven or polyfills.