
Exactly my experience. Numerous times I've found ways to reduce accidental complexity in software. But guess what: my change breaks 50 tests, because the accidental complexity happened to be under test. What I anticipated to be an enjoyable 2-hour refactoring turns into a 3-day chore. I guess I'll just leave the complexity in there.

Sure, those were "bad" tests. Accidental complexity is not supposed to be tested. And yet I've had this experience at every single company I've worked at. Maybe I'm just incredibly unlucky. Or maybe there's something wrong with the "tests are good" mantra.

"Write tests. Not too many. Mostly integration"


Tests make refactoring slower, because you might have to also update the tests. On the other hand, in a complex enough code base, refactoring is impossible without sufficient test coverage. You can rely on the tests to catch edge cases you weren't aware of in areas of the product built on top of the architecture you're maintaining.

Sometimes, all 50 of those stupidly over-complex tests that people have copy-pasted around are garbage that gets in the way of your refactor, but sometimes one of them is protecting an important functional requirement you didn't think about, like "Test that this code called from a tight loop doesn't query the db hundreds of times, which would lead to the startup time taking minutes rather than seconds".

> Tests make refactoring slower, because you might have to also update the tests

No, tests can make refactoring slower, because they often test implementation details instead of the interface. Being able to distinguish the two is not as easy as it sounds; typically it requires a deep understanding of the domain, which the people writing the tests might not have yet.
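A hedged sketch of the distinction: both tests below cover the same hypothetical `Cache` class, but one pins the interface and the other pins an implementation detail:

```python
class Cache:
    """Toy key-value cache; the dict is an implementation detail."""

    def __init__(self):
        self._store = {}  # private: happens to be a plain dict today

    def put(self, key, value):
        self._store[key] = value

    def get(self, key, default=None):
        return self._store.get(key, default)


def test_interface():
    # Tests observable behavior only. Survives any correct
    # reimplementation of Cache (LRU list, sqlite, whatever).
    c = Cache()
    c.put("a", 1)
    assert c.get("a") == 1
    assert c.get("missing", 0) == 0


def test_implementation_detail():
    # Reaches into the private dict. Breaks the moment Cache is
    # refactored internally, even though get/put behave identically.
    c = Cache()
    c.put("a", 1)
    assert isinstance(c._store, dict)
    assert c._store["a"] == 1
```

Both pass today; only the second one turns a behavior-preserving refactor into a red build.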

> in a complex enough code base, refactoring is impossible without sufficient test coverage

Sure, and in my experience complex code bases are typically complex due to the amount of accidental complexity, not because they actually solve complex problems. Like layers of overengineered classes intertwined through dependency injection for the sake of being more... testable. See the irony?

And indeed, refactoring becomes quite hard. You have two options to make it easier:

1. Identify and remove accidental complexity.

2. Cover it with tests and set the complexity in stone.

It's like fighting cancer with painkillers. It might look like it's working, but you're going to die pretty soon.

> Test that this code called from a tight loop doesn't query the db hundreds of times which would lead to the startup time taking minutes rather than seconds

Oh, this is a great example. I saw the exact same thing implemented a couple of years ago. The whole team hated it because it caused tests to break constantly. And yet nobody on the team had the balls to remove it, and they even followed the pattern for new tests, because past tests had it.

If startup time matters for your app, there will be plenty of signals apart from tests that it's broken. If startup time doesn't matter, why are you testing it?

By the way, you may not even have noticed, but you conflated startup time with the number of queries. You're testing the wrong thing. What if hundreds of queries are still blazingly fast (due to being simple and/or cacheable) and have a negligible effect on startup time? What if the existing startup time is already extremely high (e.g. you're loading ML models into memory)?

If startup time truly matters to you, you shouldn't be testing it, you should be monitoring it. And I'm not even talking about some complex monitoring system. When someone on your team says their developer experience got worse, that's the type of alert you need to listen to.
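A minimal sketch of what "monitor it, don't test it" could look like, assuming a hypothetical `record_metric` hook; in a real app this would feed whatever metrics backend (statsd, Prometheus, or just a log line) is already in place:

```python
import time


def record_metric(name, value):
    # Stand-in: a real app would ship this to its metrics/logging system
    # so regressions show up on a dashboard instead of as a failing test.
    print(f"{name}={value:.3f}s")


def measure_startup(startup_fn):
    """Run the app's startup routine and record how long it took."""
    start = time.monotonic()
    result = startup_fn()
    record_metric("app.startup_seconds", time.monotonic() - start)
    return result
```

Unlike the query-count test, this measures the quantity that actually matters (elapsed time) and never fails a build; it just makes the trend visible.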
