Isn't this typical of what you get when you have a language that's easy for newcomers to pick up?
At one point it was PHP. Then it was Rails. Of late it's been Node. I'm sure Windows users might have some old VB war stories. (Admittedly, I'm not sure what to make of the Java StackOverflow tag, seeing that it's also a sewer. Presumably school- or Android-related.)
> Redundancy sucks. Redundancy always means duplicated efforts, and sometimes interoperability problems. But dependencies are worse. The only reasonable thing to depend on is a full-fledged, real module, not an amorphous bunch of code. You can usually look at a problem and guess quite well if its solution has good chances to become a real module, with a real owner, with a stable interface making all its users happy enough. If these chances are low, pick redundancy. And estimate those chances conservatively, too. Redundancy is bad, but dependencies can actually paralyze you. I say – kill dependencies first.
That is an excellent quote that says so much of what I've been trying to convey to my peers for some time, but much more tersely and eloquently. Thank you for sharing!
There is no such thing as Zalgo characters. It is plain Unicode (UTF-8 encoded); in particular, it is an abuse of combining characters. Disabling those can cripple non-English languages.
I'm actually not sure where the term Zalgo comes from. I know the effect is an abused Unicode feature, but I was never sure how the term "Zalgo" ended up describing it.
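A minimal sketch of the mechanism (the specific code points are arbitrary examples): a base character followed by combining marks from the Combining Diacritical Marks block renders as a single, increasingly decorated glyph cluster, and the same machinery is load-bearing for ordinary accented text.

```javascript
// "Zalgo" text is ordinary Unicode: a base character followed by a pile
// of combining marks (here from U+0300-U+036F, Combining Diacritical Marks).
const zalgo = "e" + "\u0300\u0316\u0352"; // combining grave, grave-below, fermata
console.log(zalgo.length); // 4 code units, but it renders as one glyph cluster

// Stripping combining characters would break real languages too: é can be
// precomposed U+00E9 or "e" plus a combining acute, and NFC maps between them.
const composed = ("e" + "\u0301").normalize("NFC");
console.log(composed === "\u00e9"); // true
```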
While this is an extreme case, of course, packages by themselves are quite cheap (apart from install times), and if the author keeps the dependencies in check it usually doesn't create a mess.
> In practice though, even the smallest toy project will pull in a hundred tiny dependencies.
I had a problem with running out of inodes on my server because of the massive number of files that end up in node_modules for even a smallish project.
NPM caused horrendous issues in my dev environment a while back. I was developing on a Windows 7 machine with my Node environment in a VirtualBox VM. I was keeping my code in a VirtualBox shared folder so I could edit it in my IDEs on the Windows side and then easily run `npm install` etc. on the Linux side.
Turns out, the combination of vboxsf and NTFS masquerading as a Linux filesystem is not without bugs.
Sometimes, the directory depths that npm reached caused the filesystem to "soft crash", where it would just fail to do operations on files, meaning you had to run `npm install` a few times to actually grab all the dependencies.
Other times, it caused the host machine to blue screen.
One of the times this happened, it had the side effect of irreversibly corrupting my favourite programming font (Fira Code) for all JetBrains IDEs on that Windows install.
Not so much a problem now I'm not being forced to develop in Windows. But even though I dug up bugs in unrelated pieces of software / the host OS kernel, npm was still pushing that software to the limits with that crazy recursive directory structure it was attempting to build.
Another classic is installing one dependency with NPM and having it blow up because nested dependencies pushed a file path past Windows' 260-character MAX_PATH limit.
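A rough illustration of why: before npm@3 flattened installs, every dependency nested under its parent's node_modules, so path length grows linearly with tree depth. The package names and base path below are made up.

```javascript
// Sketch of legacy (pre-npm@3) nesting: each level adds another
// "node_modules\<name>" segment. All names here are hypothetical.
const chain = ["my-app", "build-tool", "file-watcher", "glob-matcher",
               "pattern-lib", "util-helpers"];

const installedPath = "C:\\Users\\dev\\project\\" +
  chain.map(p => `node_modules\\${p}`).join("\\") +
  "\\lib\\index.js";

// Even at a modest depth of six, the path is most of the way to the
// 260-character MAX_PATH ceiling.
console.log(installedPath.length);
```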
Grouping all these tiny libraries under an umbrella project will just end up like PHP: no consistency, and varying levels of repetition, redundancy, and quality.
Having them separate like this at least means that if one library doesn't implement things 'right' you can switch to another easily.
Edit: good related reading:
http://yosefk.com/blog/redundancy-vs-dependencies-which-is-w...