The Economic Case for JUnit Max
In preparing to relaunch JUnit Max, I’ve been trying to articulate exactly why it is worth the price. I’m still conflicted about charging for Max, although realistically, if I had no chance of being paid for it, I couldn’t afford to work on it. If I’m going to charge money, though, I’d like to know that Max is worth it.
For me this isn’t an issue. I run tests all the time and I appreciate the time savings and additional focus Max gives me. Of course, I gave up whatever else the time spent implementing Max could have bought, but that’s a sunk cost for me. I simply like programming more when I have Max, which is a big part of my motivation in the re-launch.
If you’re out there with $100 in your pocket, though, I can imagine that you need convincing. If you’re going to ask someone else for the $100, then they need convincing. So, does Max make economic sense?
Here’s a back-of-the-envelope calculation based entirely on time savings. I’ll take the JUnit unit tests as a baseline: they take about 10 seconds to run, and while we’re programming we run them ~50 times per hour. Scaling this up to a full-time job, we would be spending 10 seconds/run * 50 runs/hour * 1000 programming hours/year ≈ 139 hours/year waiting for tests to finish. At $200/hour, your employer is paying nearly $28,000 a year for you to wait for test results.
With Max, wait times are reduced because of test prioritization. With the JUnit test suite we get results in 2 seconds instead of 10, an 80% reduction. Even if the savings were only 50%, though, Max would still be worth roughly $14,000/year (and the savings on longer-running test suites will be larger than the 80% we get for the JUnit tests). In other words, Max pays for itself roughly every two working days.
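For readers who want to check the arithmetic or plug in their own numbers, here is a small sketch of the calculation above. The figures (10 seconds/run, 50 runs/hour, 1000 programming hours/year, $200/hour) come straight from the text; the variable names are my own.

```python
# Back-of-the-envelope cost of waiting for tests, using the figures above.
SECONDS_PER_RUN = 10   # baseline JUnit suite run time
RUNS_PER_HOUR = 50     # test runs per programming hour
HOURS_PER_YEAR = 1000  # full-time programming hours per year
RATE = 200             # billing rate, dollars per hour

# Total time spent waiting, converted from seconds to hours.
wait_hours = SECONDS_PER_RUN * RUNS_PER_HOUR * HOURS_PER_YEAR / 3600
cost = wait_hours * RATE
print(f"{wait_hours:.0f} hours/year waiting, costing ${cost:,.0f}")

# Max cuts the JUnit suite from 10 seconds to 2: an 80% reduction.
savings_80 = cost * 0.8
# Even a conservative 50% reduction is substantial.
savings_50 = cost * 0.5
print(f"80% savings: ${savings_80:,.0f}; 50% savings: ${savings_50:,.0f}")
```

Substitute your own suite time, run frequency, and rate to see what waiting costs on your project.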
And that doesn’t count all the times I stay focused on programming because the wait is only two seconds, instead of getting distracted during a 10- or 30-second pause and not getting anything done for several minutes. I can only conclude that at $100/year Max is seriously underpriced. My conscience is assuaged.