We’ve gotten a lot of good out of agile development methodologies. I’m never a “One True Way” kind of person, but agile has a ton of benefits that development needed (and Ops increasingly needs). Extending that to the Dev side of DevOps is painless, since much of agile is wrapped in DevOps clothes these days.
From my perspective, the most impactful bit to come from agile was test-driven development (TDD). By writing the test to fulfill the requirement, then writing the code to pass the test, our small team (I was on a team of developers with no Ops people supporting us; or, more accurately, we were the Ops team when I first used TDD) got the assistance it needed. Without dedicated QA staff, we were able to run as rigorous a test suite as our changes warranted, simply by rerunning the tests we'd used for development.
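For anyone who hasn't lived that rhythm, here's a minimal sketch of it (the requirement, names and numbers are all invented for illustration; I'm using Python with pytest-style tests):

```python
# Step 1 in TDD: write the tests for the requirement first.
# Hypothetical requirement: orders of $100 or more get a 10 percent discount.

def test_no_discount_below_threshold():
    assert apply_discount(99.99) == 99.99

def test_discount_at_threshold():
    # The boundary case: exactly $100.00 must qualify.
    assert apply_discount(100.00) == 90.00

# Step 2: write just enough code to make those tests pass.

def apply_discount(total: float) -> float:
    """Apply a 10 percent discount to orders of $100 or more."""
    if total >= 100.00:
        return round(total * 0.9, 2)
    return total
```

Run the tests, watch them fail, write the code, watch them pass. The suite then sticks around as the regression check we leaned on in place of dedicated QA.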
Of course, this is old news to some of you: TDD has largely been absorbed into the larger agile/DevOps movements, and you're at it daily. But the shift to having developers do more of their own testing is something that I'm still pretty iffy about, to be honest.
Devs Aren’t Necessarily Testers
While it can work (devs certainly can understand the weak points in their code), I would argue that the person writing the code is not the right person to write the tests for it, particularly as you get into the integration phase of testing and the complexity of code inter-relationships heats up. Some developers will do well at it. My experience is, many will not. Why? Because if they don't know of a weakness in their solution when writing the solution, they won't know of the weakness when writing the test case. And it is a truly rare shop that has 100 percent code coverage for testing, so those weaknesses can end up being time bombs.
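To make that concrete, here's an invented example of the shared blind spot: the author's test exercises exactly the inputs the author had in mind, so the weakness survives both the code and the test.

```python
def parse_csv_line(line: str) -> list[str]:
    """Naive CSV split; the author never considered quoted fields."""
    return [field.strip() for field in line.split(",")]

def test_parse_simple_line():
    # The dev's own test: only the happy path they pictured.
    assert parse_csv_line("a, b, c") == ["a", "b", "c"]

def test_quoted_field_with_embedded_comma():
    # The question a second set of eyes tends to ask. This test fails,
    # exposing the blind spot the solo author built into both places.
    assert parse_csv_line('"Smith, John", 42') == ["Smith, John", "42"]
```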
Simply put, QA needs to work with devs to write the tests. Two heads are indeed better than one when looking for faults in solution designs. But the reality of development often doesn't allow for this type of cooperation. When you're on a sprint and this one bit is slated for half a day, setting aside time to get together is a hard sell for a problem the developer has already solved in their head and written the test for. After all, the whole point of agile and DevOps was to speed delivery, and making the simple more complex isn't generally part of that equation.
Have a Plan, and Use It
I know of a few shops where integration testing is pretty much an afterthought. One has a QA team that fights to make certain it is done, and done well (though still as an afterthought), rather than cobbled together as a PoC (I was thinking Proof of Concept; take either use of that acronym, both are accurate) not long before release. The others muddle through, figuring out what doesn't work in fits and starts (too often in production) rather than following a plan. The result is, unsurprisingly, missed deadlines and ever-extending sprints.
For all of the good that DevOps in general, and CI/CD in particular, have brought us, there are a few areas (testing being one of them) where speed and/or quantity has replaced quality. We run tests a LOT. But are they the right tests? For unit tests in a TDD environment, yes, at least for the lines that have test coverage. Outside of that narrow space, I think the answers are much more iffy, and depend upon a lot of things.
The best hedge is to have QA involved in writing the tests for TDD environments, even for small bits. It isn't overhead if it shortens the overall timeline to delivery; even if it slows a piece here or there, it's the timeline to a finished product that really matters, not the timeline of any given piece of it.
If you don't have a QA team (as many shops do not), I recommend devs write the tests for each other. That's not to say QA isn't needed, but it does apply the "two heads are better than one" mantra and keeps the ideas flowing. Failing that, the Ops members of a DevOps team are great for the job: historically, production woes fell on them, so no doubt they've honed their skills at detecting problem areas.
Don’t Be That Person
No matter how you go about it, do not go light on the testing. Do TDD, and keep it up. Make every change request start with test modification, and then fix what fails. Document the tests, too; they're code, and stating what is being tested and why is as important as the code explaining what it's doing and why.
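A sketch of what that discipline might look like (the change request, names and numbers are all hypothetical):

```python
MAX_DELAY = 60.0  # seconds; the cap on any single retry wait

def backoff(attempt: int) -> float:
    """Exponential backoff, capped so no retry waits longer than MAX_DELAY."""
    return min(2.0 ** attempt, MAX_DELAY)

def test_backoff_caps_at_max_delay():
    """The retry delay must never exceed MAX_DELAY.

    Why: a (hypothetical) change request reported that unbounded
    exponential backoff stalled batch jobs for hours. Per the
    test-modification-first rule, this test was updated before the
    min() cap was added to backoff().
    """
    for attempt in range(20):
        assert backoff(attempt) <= MAX_DELAY
```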
After all, you don't want the difference between '==' and '>=' to put you in the headlines for a week, mostly with bad news. That difference is an obvious TDD test case if ever there was one; don't be that person.
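If you want that boundary case spelled out, here's a hedged sketch (the limit rule is invented):

```python
def is_over_limit(amount: float, limit: float = 1000.0) -> bool:
    # The whole bug lives on this line: an accidental '==' would flag
    # only an amount of exactly 1000.0 and wave everything else through.
    return amount >= limit

def test_at_limit():
    assert is_over_limit(1000.00)

def test_above_limit():
    # The case '==' silently misses; the obvious TDD boundary test.
    assert is_over_limit(1000.01)

def test_below_limit():
    assert not is_over_limit(999.99)
```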
Continuous testing works, and test-driven development works. You just have to think about them when implementing, and stick to a plan that tests from beginning to end. It's about quality code quicker, not just any code quicker. Note that I've been focused on the process end of testing, not the smoke/load/regression end. Those are important too, but their issues are different, so I opted to save them for a separate blog.
So go forth, iterate quickly, but test thoroughly. I doubt most of us will achieve 100 percent code coverage anytime soon, let alone 100 percent nested integration coverage … but test what's important, and bring what's important to test.