Problems of package management are sapping productivity for tech workers

(written by lawrence krubner, however indented passages are often quotes). You can contact lawrence at: lawrence@krubner.com, or follow me on Twitter.

It is especially bad on the front end:

The situation with packages and dependency hell today is horrendous, particularly if you work in a highly dynamic environment like web development.
I want to illustrate this with a detailed example of something I did just the other day, when I set up the structure for a new single page web application. Bear with me, this is leading up to the point at the end of this post.
To build the front-end, I wanted to use these four tools:
– jQuery (a JavaScript library)
– Knockout (another JavaScript library)
– SASS (a preprocessor to generate CSS)
– Jasmine (a JavaScript library/test framework)
Notice that each of these directly affects how I write my code. You can install any of them quite happily on its own, with no dependencies on any other tool or library. They are all actively maintained, but if what you’ve got works and does what you need then generally there is no need to update them to newer versions all the time either. In short, they are excellent tools: they do a useful job so I don’t have to reinvent the wheel, and they are stable and dependable.
In contrast, I’m pretty cynical about a lot of the bloated tools and frameworks and dependencies in today’s web development industry, but after watching a video[1] by Steven Sanderson (the creator of Knockout) where he set up all kinds of goodies for a large single page application in just a few minutes, I wondered if I was getting left behind and thought I’d force myself to do things the trendy way.
About five hours later, I had installed or reinstalled:
– 2 programming languages (Node and Ruby)
– 3 package managers (npm with Node, gem with Ruby, and Bower)
– 1 scaffolding tool (Yeoman) and various “generator” packages
– 2 tools that exist only to run other software (Gulp to run the development tasks, Karma to run the test suite) and numerous additional packages for each of these so they know how to interact with everything else
– 3 different copies of the same library (RequireJS) within my single project’s source tree, one installed via npm and two more via Bower, just to use something resembling modular design in JavaScript.
And this lot in turn made some undeclared assumptions about other things that would be installed on my system, such as an entire Microsoft Visual C++ compiler set-up. (Did I mention I’m running on Windows?)
I discovered a number of complete failures along the way. Perhaps the worst was what caused me to completely uninstall my existing copy of Node and npm, which I'd only installed about three months earlier: the scaffolding tool, whose only purpose is to automate the hassle of installing lots of packages and templates, completely failed to install numerous packages and templates using my previous version of Node and npm, and npm itself, whose only purpose is to install and update software, couldn't update Node and npm themselves on a Windows system.
Then I uninstalled and reinstalled Node/npm again, because it turns out that using 64-bit software on a 64-bit Windows system is silly, and using 32-bit Node/npm is much more widely compatible when its packages start borrowing your Visual C++ compiler to rebuild some dependencies for you. Once you’ve found the correct environment variable to set so it knows which version of VC++ you’ve actually got, that is.
I have absolutely no idea how this constitutes progress. It’s clear that many of these modern tools are only effective/efficient/useful at all on Linux platforms. It’s not clear that they would save significant time even then, compared to just downloading the latest release of the tools I actually wanted (there were only four of those, remember, or five if you count one instance of RequireJS).
And here’s the big irony of the whole situation. The only useful things these tools actually did, when all was said and done, were:
– Install a given package within the local directory tree for my project, with certain version constraints.
– Recursively install any dependent packages the same way.
That’s it. There is no more.
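For what it's worth, those two behaviours are exactly what a minimal npm manifest already expresses. Here is a sketch; the package names are real npm packages, but the version ranges are illustrative rather than anything taken from the project above:

    {
      "name": "my-single-page-app",
      "version": "0.1.0",
      "dependencies": {
        "jquery": "^3.0.0",
        "knockout": "^3.4.0",
        "requirejs": "^2.3.0"
      },
      "devDependencies": {
        "jasmine-core": "^2.5.0"
      }
    }

Running npm install in that directory resolves each version range, installs the packages under the project's local node_modules tree, and recursively does the same for whatever those packages declare as their own dependencies. That is the entire useful output of the five hours described above.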
The only things we need to solve the current mess are standardised, cross-platform ways to:
– find authoritative package repositories and determine which packages they offer
– determine which platforms/operating systems are supported by each package
– determine the available version(s) of each package on each platform, which versions are compatible for client code, and what the breaking changes are between any given pair of versions
– indicate the package/version dependencies for a given package on each platform it supports
– install and update packages, either locally in a particular “virtual world” or (optionally!) globally to provide a default for the whole host system.
This requires each platform/operating system to support the concept of the virtual world, each platform/operating system to have a single package management tool for installing/updating/uninstalling, and each package’s project and each package repository to provide information about versions, compatibility and dependencies in a standard format.
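To make that wish-list concrete, here is one purely hypothetical sketch of what such a standardised package description might contain; every field name and value below is invented for illustration and corresponds to no existing format:

    {
      "name": "example-widget",
      "repository": "https://packages.example.org/example-widget",
      "versions": {
        "2.1.0": {
          "platforms": ["linux-x64", "windows-x64", "macos-arm64"],
          "dependencies": { "example-core": ">=1.4.0 <2.0.0" },
          "breaking-changes-since": { "1.x": "renamed init() to start()" }
        }
      },
      "install-scopes": ["local", "global"]
    }

Each bullet maps onto a field: the repository URL says where the package lives, platforms says what it runs on, the per-version dependency and breaking-change entries cover compatibility, and install-scopes covers the choice between a local virtual world and a system-wide default.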

Also an interesting debate, elsewhere in the comments, regarding Nix versus Guix:

The language is pretty much designed for writing JSON-style configuration, except as a formal programming language, which is what the vast majority of Nix code is (both package definitions and system configurations).

Guix uses an embedded domain specific language that is also designed for easily writing package recipes, but it uses s-expressions instead of something that is “JSON-style”. Also, Nix build scripts are written in Bash, whereas Guix build scripts are written in Scheme. I think that makes Guix more consistent in its programming style.
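To make the contrast concrete: the "JSON-style" remark above means a Nix recipe is essentially an attribute set of name = value pairs, much like the JSON manifests shown earlier, while the Guix equivalent is ordinary Guile Scheme. A rough sketch of a Guix package definition follows; the package, its URL, its hash and its metadata are all invented for illustration, though the shape mirrors real recipes:

    (define-module (example packages widget)
      #:use-module (guix packages)
      #:use-module (guix download)
      #:use-module (guix build-system gnu)
      #:use-module ((guix licenses) #:prefix license:))

    ;; Hypothetical package: every field below is an assumption made for
    ;; illustration; only the overall structure mirrors real Guix recipes.
    (define-public example-widget
      (package
        (name "example-widget")
        (version "2.1.0")
        (source (origin
                  (method url-fetch)
                  (uri "https://example.org/example-widget-2.1.0.tar.gz")
                  ;; Placeholder string, not a real checksum.
                  (sha256
                   (base32
                    "0000000000000000000000000000000000000000000000000000"))))
        (build-system gnu-build-system)
        (synopsis "Hypothetical package used only to illustrate s-expression syntax")
        (description "Not a real package; it exists only to show the recipe style.")
        (home-page "https://example.org")
        (license license:gpl3+)))

Whether attribute sets or s-expressions read better is largely a matter of taste; the more substantive difference the commenters point to is that Guix uses the same language, Scheme, for its build scripts as well.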

Additionally, with Nix, you can be close to certain that if you build something twice, you’ll get the same result, because it can’t access impure resources.

Guix has this same certainty because it uses the Nix daemon, and its defaults are a bit stricter than Nix's.

Finally, because Guix is a GNU project, the official repositories are going to go nowhere near non-free software.

That doesn’t mean that you can’t host your own non-free packages or use someone else’s non-free packages. But yes, Guix does not ship with packages that ask the user to give up their freedom. To me, that’s an advantage.

Post external references

  1. https://news.ycombinator.com/item?id=8226139