I think there’s a much stronger argument for policies that limit both the number and the complexity of dependencies. Don’t add a dependency unless it’s highly focused (no “everything libraries” that pull in entire universes of their own) and carries a high level of value. A project’s entire dependency tree should be small and clean.

Libraries themselves should perhaps also take a page from the book of Linux distributions and offer LTS (long term support) releases that are feature frozen and include only security patches, which are much easier to reason about and periodically audit.


I've seen this argument made frequently. It's clearly a popular sentiment, but I can't help but feel that it's one of those things that sounds nice in theory if you don't think about it too hard. (Also, cards on the table, I personally really like being able to pull in a tried-and-tested implementation of code to solve a common problem, one that's used by, in some cases, literally millions of other projects. I dislike having to re-solve a problem I've already solved elsewhere.)

Can you cite an example of a moderately-widely-used open source project or library that is pulling in code as a dependency that you feel it should have replicated itself?

What are some examples of "everything libraries" that you view as problematic?

Anything that pulled in chalk. You need a very good reason to emit escape sequences. The whole npm (and Rust, Python, ...) ecosystem assumes that if it’s a tty, then it’s a full-blown xterm-256color terminal. And then you need to pipe to cat or less to get sensible output.

So if you’re adding chalk, that generally means you don’t know jack about terminals.
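The cautious behavior being asked for here can be sketched in a few lines. This is a hypothetical minimal check in Python (the names `supports_color` and `red` are illustrative, not from any library): only emit escape sequences when the stream is actually a tty and TERM doesn't say otherwise, so piped output stays plain.

```python
import os
import sys

def supports_color(stream=sys.stdout) -> bool:
    """Conservative check before emitting ANSI escape sequences."""
    if not hasattr(stream, "isatty") or not stream.isatty():
        return False  # piped to cat/less/a file: emit plain text
    if os.environ.get("TERM", "") in ("", "dumb"):
        return False  # terminal explicitly claims no capabilities
    return True

def red(text: str) -> str:
    # Wrap in SGR red (31) and reset (0) only when safe to do so.
    return f"\033[31m{text}\033[0m" if supports_color() else text

print(red("error: something went wrong"))
```

This is roughly the opposite of the assumption being criticized above: default to plain text and opt in to color, rather than the reverse.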

In the Python world, people often enough use Rich so that they can put codes like [red] into a string that are translated into the corresponding ANSI. The end user pays several megabytes for this by default, as Rich will also pull in Pygments, which is basically a collection of lexers for various programming languages to enable syntax highlighting. They also pay for a rather large database of emoji names, a Markdown parser, logic for table generation and column formatting, etc., all of which might go unused by someone who just doesn't want to remember \e[31m (or re-create the lookup table and substitution code).
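For scale, the "lookup table and substitution code" mentioned above fits comfortably in a dozen lines. This is a hypothetical sketch, not Rich's actual implementation (`ANSI` and `render` are made-up names, and the tag grammar is deliberately simplified):

```python
import re

# Minimal lookup table: markup name -> ANSI SGR parameter (illustrative subset)
ANSI = {"red": "31", "green": "32", "yellow": "33", "bold": "1"}

_TAG = re.compile(r"\[(/?)(\w+)\]")

def render(markup: str) -> str:
    """Translate [red]...[/red]-style tags into raw ANSI escape codes."""
    def repl(m: re.Match) -> str:
        closing, name = m.groups()
        if name not in ANSI:
            return m.group(0)  # leave unknown tags untouched
        return "\033[0m" if closing else f"\033[{ANSI[name]}m"
    return _TAG.sub(repl, markup)

print(render("[red]error:[/red] disk full"))
```

Whether that tradeoff is worth megabytes of transitive dependencies is exactly the question this thread is circling.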

Exactly! ANSI escape codes are old and well defined for all the basic purposes.

Pulling in a huge library just to set some colors is like hiring a team of electrical contractors to plug in a single toaster.

Some people appreciate it when terminal output is easier to read.

If chalk emits sequences that aren't supported by your terminal, then that's a deficiency in chalk, not the programs that wanted to produce colored output. It's easier to fix chalk than to fix 50,000 separate would-be dependents of chalk.

I appreciate your frustration but this isn't an answer to the question. The question is about implementing the same feature in two different ways, dependency or internal code. Whether a feature should be added is a different question.

Chalk appears to be a great example.

I wonder how many devs are pulling in a whole library just to add colors. ANSI escape sequences are as old as dirt and very simple.

Just make some consts for each sequence that you intend to use. That's what I do, and it typically only adds a dozen or so lines of code.
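As a concrete illustration of the "dozen or so lines" approach (constant names here are just one common convention):

```python
# A handful of constants covers the common cases; no dependency needed.
RESET = "\033[0m"
BOLD = "\033[1m"
RED = "\033[31m"
GREEN = "\033[32m"
YELLOW = "\033[33m"
BLUE = "\033[34m"

print(f"{RED}FAIL{RESET} tests/test_io.py")
print(f"{GREEN}PASS{RESET} tests/test_api.py")
```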

The problem isn't the implementation of what I want to do. It's all of the implementations of things I never cared about doing, plus an implementation of the thing I do want that is so much more complex than it needs to be that I could easily have written it myself in less time.

The problem is also less about the implementation I want; it's about the 10,000 dependencies of things I don't really want. All of that is attack surface, far larger than that of some simple function.

Most of your supply chain attack surface is social engineering attack surface. Doesn't really matter if I use Lodash, or 20 different single-function libraries, if I end up trusting the exact same people to not backdoor my server.

Of course, small libraries get a bad rap because they're often maintained by tons of different people, especially in less centralized ecosystems like npm. That's usually a fair assessment. But a single author will sometimes maintain 5, 10, or 20 different popular libraries, and adding another library of theirs won't really increase your social attack surface.

So you're right about "pull[ing] in universes [of package maintainers]". I just don't think complexity or number of packages are the metrics we should be optimizing. They are correlates, though.

(And more complex code can certainly contain more vulnerabilities, but that can be dealt with in the traditional ways. Complexity begets simplicity, yadda yadda; complexity that only begets complexity should obviously be eliminated.)

I think AI nudges the economics more in this direction as well. Adding a non-core dependency has historically bought short-term velocity in exchange for different long-term maintenance costs. With AI, there are now many more cases where a first-party implementation becomes cheaper/easier/faster in both the short term and the long term.

Of course it's up to developers to weigh the tradeoffs and make reasonable choices, but now we have a lot more optionality. Reaching for a dependency no longer needs to be the default choice of a developer on a tight timeline/budget.

Let's have AI generate the same vulnerable code across hundreds of projects, most of which will remain vulnerable forever, instead of having those projects all depend on a central copy of that code that can be fixed and distributed once the issue gets discovered. Great plan!

You're attacking a straw man. No one said not to use dependencies.

At one stage in my career the startup I was working at was being acquired, and I was conscripted into the due-diligence effort. An external auditor had run a scanning tool over all of our repos, and the team I was on was tasked with going through thousands of snippets across ~100 services and doing something about them.

In many cases I was able to replace 10s of lines of code with a single function call to a dependency the project already had. In very few cases did I have to add a new dependency.

But directly relevant to this discussion is the story of the most copied code snippet on Stack Overflow of all time [1]. Turns out, it was buggy. And we had more than one copy of it. If it hadn't been for the due-diligence effort, I'm 100% certain they would still be there.

[1]: https://www.hackerneue.com/item?id=37674139

Sure, but that doesn't contradict the case for conservatism in adding new dependencies. A maximally liberal approach is just as bad as the inverse. For example:

* Introducing a library with two GitHub stars from an unknown developer

* Introducing a library that was last updated a decade ago

* Introducing a library with a list of aging unresolved CVEs

* Pulling in a million lines of code that you're reasonably confident you'll never have a use for 99% of

* Relying on an insufficiently stable API relative to the team's budget, which risks eventually becoming an obstacle to applying future security updates (if you're stuck on version 11.22.63 of a library with a current release of 20.2.5, you have a problem)

Each line of code included is a liability, regardless of whether that code is first-party or third-party. Each dependency in and of itself is also a liability and ongoing cost center.

Using AI doesn't magically make all first-party code insecure. Writing good code and following best practices around reviewing and testing is important regardless of whether you use AI. The point is that AI reduces the upfront cost of first-party code, thus diluting the incentive to make short-sighted dependency management choices.

Won't using highly focused dependencies increase the number of dependencies?

Limiting the number of dependencies, but then rewriting them in your own code, will also increase the maintenance burden and compile times.

A lot of projects use dependencies but only use a small part of them, or use them in a single place for a single use case. Like bringing in formik (npm) when you only have one single form, or moment because you want to format a single date.

The lower level the dependency is, the less justifiable it is for it to have its own dependencies. This ought to be a point of competition between libraries, and often is, at least in the C++ world.

I'd be willing to pay $100 to upvote your comment 100x.

How do you think dang puts bread on the table?
