A colleague I worked with recently quit their career as a programmer over a hatred of tests. Over some beers, they told me, "I just can't write another one!" I thought they were a great programmer with lots of potential, but testing is an important part of programming.
As a more senior programmer, I can see the merit of testing, but even when I was younger I did not care for TDD, as it felt unrewarding relative to the effort it demanded. I've never had a non-technical manager tell me "nice tests".
Do you know any ways of making testing more fun or enjoyable, especially for younger developers? Either in the way we look at testing at an institutional level, or in the way we implement tests at the code level? And when is testing too much?
...
There are two questions we need to ask ourselves:
Why should we write tests?
Why are we told to write tests?
The first question has to do with someone's progression in mastering the field of software development. The second stems from the managerial addiction to metrics, combined with the reductive reasoning that more tests mean fewer bugs.
No developer truly wants to write tests. They do it either because they know it's the right thing to do or because management forces them to. Where things get confusing is that no piece of software needs to have a test. We write a test because it is appropriate, the implication being that it is not always necessary to write one.
The idea that not all of the code you write needs tests is blasphemy to the testing purists, but the opinion of any purist is heavily biased towards them being right while everyone else is wrong. To develop the skill of writing useful tests, you must first stop listening to the purists, or they will turn you off from testing entirely.
Why we should write tests is easy enough to understand: we write a test today to save time tomorrow. If writing a test won't save us time later, it shouldn't be written. Tests come with a cost: a cost to write and a cost to maintain. There has to be an expected return on investment to cover that cost. Whether it's saving time on a future major refactoring, avoiding a production crisis that takes days to troubleshoot, or reducing the effort of training new developers on an unfamiliar codebase, there must be some future time savings.
Should you write a lot of tests while you are learning how to write tests? Yes, of course - how else can you learn how to write tests correctly? Do you have to write a lot of tests after you are skilled at writing tests? It depends on your specific situation. Sometimes, you need a full suite of unit, integration, and functional tests, and sometimes you don't. Knowing what tests are appropriate, and when and if they should be added, is something that comes with experience after writing thousands of tests over many years. Once you get that experience, however, you will develop the judgment to know where and how to write which types of tests to get the best return on your investment.
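To make the return-on-investment framing concrete, here is a minimal sketch of the kind of test I consider high-value: a unit test pinned to the edge cases that a future refactoring is most likely to break. The function and its rules here (apply_discount, the clamping behaviour, the specific cases) are hypothetical, invented purely for illustration.

```python
# A minimal sketch of a high-return unit test, using Python's built-in
# unittest module. The function and its business rules are hypothetical.

import unittest


def apply_discount(price: float, percent: float) -> float:
    """Apply a percentage discount, clamping percent to the 0-100 range."""
    if price < 0:
        raise ValueError("price must be non-negative")
    percent = max(0.0, min(100.0, percent))
    return round(price * (1 - percent / 100), 2)


class ApplyDiscountTest(unittest.TestCase):
    # These edge cases are exactly the ones a future refactoring is
    # most likely to break - that is where the return on investment is.

    def test_full_discount_is_free(self):
        self.assertEqual(apply_discount(50.0, 100.0), 0.0)

    def test_discount_is_clamped_above_100(self):
        self.assertEqual(apply_discount(50.0, 150.0), 0.0)

    def test_negative_price_is_rejected(self):
        with self.assertRaises(ValueError):
            apply_discount(-1.0, 10.0)


if __name__ == "__main__":
    unittest.main()
```

The point is not the assertions themselves, but that each one buys back time: when the pricing logic is rewritten next year, these three cases fail loudly in a test run instead of surfacing as a billing incident.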
Ultimately, a developer who writes tests is lazy - which is a good thing. They do not want to be burdened with more work in the future than necessary, so they invest a few minutes today to save hours, days, or weeks later. They don't want to feel anxious during production releases, or when a novice developer starts committing to their codebase. They want to be able to refactor whatever they want, whenever they want, and know that everything still works perfectly. Lazy developers are good developers, and lazy developers work the hardest on writing tests.
Why we are told to write tests is an entirely different matter. On the surface, being asked to write tests is simply an extension of writing the appropriate tests that give you the best return on investment, thus allowing you to remain a lazy developer. The real reason, unfortunately, is rarely ever this innocuous.
Managers at all levels love metrics. Metrics make a manager feel smart, powerful, and in control. If software development were like factory work, this would be a good thing. Measuring things like products produced per unit time and their associated rate of defects is incredibly valuable when trying to create thousands of the same thing as efficiently as possible. Managers know this, and academic business education programs teach this as an immutable law of management. After all, without metrics, how can we tell if we're doing anything at all?
The software industry's addiction to metrics is the cause of many counter-productive practices that software developers suffer through every day. Demands for estimates, daily code commits, and counts of lines of code committed are all seen as valuable metrics managers can feed into charts to show off in their quarterly status meetings with executives. After all, executives are pleased when charts show positive change, so having pretty graphs is a sure way to get recognized for future promotions. This mentality is primarily to blame for the demand that developers write tests, because tests have a characteristic managers find irresistible: they yield metrics. Executives consider both the number of tests and the amount of code coverage to be strongly correlated with quality and stability, and no one is going to correct their misunderstanding.
Imagine for a moment that a competent technical manager truly wanted to decrease the number of bugs a system produced. Would they tell their developers to write tests, or would they perform a root cause analysis and fix flaws in the system so that bugs of that type could no longer be produced? The opposite approach is irrational: why would we write tests around a part of the system that is known to be the systematic root cause of many bugs of the same type? As developers, we address bugs not by focusing on their detection, but on their prevention. Metric-addicted managers, however, don't care about bug prevention, only about line charts that show progress towards a goal - no matter how flawed the goal's underlying logic is.
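To make that prevention argument concrete, here is a small hypothetical sketch: instead of writing ever more tests around a free-form status string, we change the design so that the invalid states cannot be constructed at all. The names (OrderStatus, ship_order) are invented for illustration.

```python
# A hypothetical sketch of preventing a class of bugs by design rather
# than detecting them with tests: replace a free-form status string
# (typo-prone, endlessly testable) with an enum that makes invalid
# states unrepresentable.

from enum import Enum


class OrderStatus(Enum):
    PENDING = "pending"
    PAID = "paid"
    SHIPPED = "shipped"


def ship_order(status: OrderStatus) -> OrderStatus:
    """Ship an order; only paid orders can be shipped."""
    if status is not OrderStatus.PAID:
        raise ValueError(f"cannot ship an order in state {status.name}")
    return OrderStatus.SHIPPED


# With a plain string, 'piad' or 'Paid' would each need a test (or a
# production incident) to surface. With the enum, the whole category
# of misspelled-status bugs can no longer be produced.
print(ship_order(OrderStatus.PAID))  # OrderStatus.SHIPPED
```

The design change does the work that a whole family of tests would otherwise have to do, which is exactly what root cause analysis is meant to achieve.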
For developers who are sick of writing tests, I would recommend first learning how to write the most appropriate tests for your specific situation, the ones that give you the best return on investment. Your motivation will then be to write tests because it is in your best interest, not because management forces you to.
For developers forced to write tests by management, I would recommend finding a new place to work rather than leaving the field of software development entirely. The software development industry is very young, and we're still figuring things out. One of the things we're figuring out is testing, and we're only at the beginning of that journey. Whenever we are just starting to learn something, we are novices and need to be told the right and wrong way to do things, which purists are more than happy to do. Over time, however, managers will have to acknowledge that software development is not like factory work, and things will start to change for the better. Until then, we have to be patient and invest in ourselves as individuals while we wait for the industry to improve. The good news is that there is a lot to like about being a software developer, and as long as we work in a healthy environment with a reasoned approach to testing, there is still a lot to enjoy.
...
Neil writes a lot more about the softer side of software development over at neilonsoftware.com. Check it out!