How do you balance the need to “go fast” with the need to test everything and deliver high-quality software?
With applications the driving force in today's economy, the quality and release cadence of your software are critical to your business and your bottom line. You want to get software updates into the hands of your end users as quickly as possible. However, even the fastest release cycle won't make a difference if your product ends up buggy or malfunctioning, resulting in a bad user experience or a service interruption.
I want to take a look at one of the most important (if not the most important) aspects of the software delivery process: testing! As software teams become more mature in their implementation of DevOps and continuous delivery to streamline their software delivery, testing takes a central role as a prerequisite to these practices.
You need to have testing on your mind from the very beginning, and plan your testing as part of the initial stages of designing your software—mapping your test procedures, infrastructure configuration and acceptance criteria. As you design and develop your app, you also need to think about how you're going to test it. The more testing is on your mind, the more solid your understanding will be of what you need to test and what the definition of "done" is—and the faster and more streamlined your overall process will be.
Let's review four questions every development or QA team should consider when planning their testing as they try to balance quality with speed—managing and accelerating their testing while ensuring a reasonable time-to-market.
How Much Testing Is Enough?
The age-old question: How much testing do I really need to do? Unfortunately, there is no concrete answer. But with a bit of initial legwork, you can get a good feel for how much testing is necessary.
The amount of testing that is necessary is influenced largely by the industry you're in. If you are testing mission-critical aircraft software, let's hope it's very thoroughly tested! On the other hand, if you're building a game app, the amount of necessary testing may be lower. The truth is that when you run tests, the success rate is usually not 100 percent. Therefore, the key is to define what, for your specific piece of software, can reasonably be considered "done." Instead of arbitrarily deciding at some point that your release is ready to ship, make the decision before the fact: What are we going to accept? What level of pass/fail are we willing to roll out with? What are we comfortable with based on what we have done in the past?
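To make that decision concrete, you can encode your acceptance criteria as an automated release gate that runs at the end of your test cycle. Here's a minimal Python sketch; the 98 percent pass threshold and the `critical_` naming convention are illustrative assumptions, not prescriptions—substitute whatever your team agreed on before the fact:

```python
# Minimal sketch of a release gate: encode the "definition of done" as a
# machine-checkable rule instead of a gut call at ship time.
# The threshold and the "critical_" prefix are assumed conventions.

def release_gate(results: dict[str, bool], pass_threshold: float = 0.98) -> bool:
    """results maps test names to pass/fail; returns True if shippable."""
    pass_rate = sum(results.values()) / len(results)
    # Any failing test whose name marks it as critical blocks the release.
    critical_failures = [name for name, ok in results.items()
                         if name.startswith("critical_") and not ok]
    return pass_rate >= pass_threshold and not critical_failures

if __name__ == "__main__":
    results = {"critical_login": True, "search_filters": False, "checkout": True}
    print("Ship it" if release_gate(results) else "Hold the release")
```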
The amount of testing required can sometimes be a fluid thing. A good way to measure your comfort level with your testing procedures and the amount of testing you've done is to ask yourself, "If it were 11 o'clock at night and I needed to ship this code to my end users, would I be comfortable rolling it out, or am I unsure how it's going to behave in the wild?" If you're leaning toward the latter, then it's probably an indication that you should have done some more testing. That's a gut feeling, and depending on the industry, the point at which you'll fire off a release and not have any worries can vary greatly. And, obviously, for some regulated industries, you'll need to prove compliance and ensure you've recorded that all tests were successful before you can even start thinking about releasing anything.
What Kind of Testing Should You Invest In?
Decide what kinds of things you really care about and what you need to verify and test for. What are the requirements you expect from the application, and what do the different stakeholders in the project expect?
Every piece of software needs to be tested differently. Because of this, you have to plan what the testing will be. You can’t just say, “I’m going to unit test the heck out of my code, each and every time, no matter what.” Sometimes, unit testing isn’t the most important test you need to be running.
The investment has to be in automation. No matter what types of tests you decide are most useful and relevant to your project—whether you're doing a ton of unit testing or a bunch of functional stress tests—they need to be automated.
In particular, as you open up your services to the world, you need to put in place the relevant performance and reliability tests so that if your load suddenly increases by 200 percent, you have a plan for how to handle it. Are you going to autoscale? Can you autoscale? And finally, do you even need to autoscale?
All of those questions need to be answered as part of your development cycle. Automating your tests lets you not only create tests for these scenarios, but also run those tests over and over. By automating your tests, you can establish a baseline and see where you deviate.
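As one illustration, here's a hedged sketch of a latency-baseline check using only the Python standard library; the `baseline.json` file, its `median_latency_s` key and the 20 percent tolerance are all assumptions to adapt to your own metrics and storage:

```python
import json
import statistics
import time
import urllib.request

def measure_latency(url: str, samples: int = 20) -> float:
    """Median response time in seconds over `samples` GET requests."""
    timings = []
    for _ in range(samples):
        start = time.perf_counter()
        urllib.request.urlopen(url).read()
        timings.append(time.perf_counter() - start)
    return statistics.median(timings)

def within_baseline(current: float, baseline_file: str = "baseline.json",
                    tolerance: float = 0.20) -> bool:
    """True if the current run deviates less than `tolerance` from baseline."""
    with open(baseline_file) as f:
        baseline = json.load(f)["median_latency_s"]  # assumed schema
    return current <= baseline * (1 + tolerance)
```

Run the same check on every automated cycle, and a slow regression—or a sudden load-driven slowdown—shows up as a deviation from the numbers you've already established.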
It can also be helpful to think of testing in a different way. Don't look at testing merely as a tool to validate technical requirements, but as a way to make your customers happy. You want to make sure your tests are directly contributing to your customers' overall experience. So start with the highest-level tests that look at things such as overall business functionality—and make sure the software works at the business level.
How Can You Make Testing Faster?
Automate, automate, automate! That might seem obvious, but try to think outside the box. Find things you thought couldn't be automated and wrangle them into shape. Remember that every minute one of your developers spends manually running a test that could be automated costs you time (and money) that could have gone to more productive work. In the long run, not automating can cost much more than making the commitment to doing it right.
Another simple thing you can do to increase the speed of your testing is to make sure your development environments are mirror images of your production environments, and that you have configuration fidelity throughout your lower and higher environments.
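One way to keep that fidelity honest is an automated drift check that diffs the settings of two environments and fails the pipeline when they diverge. A minimal sketch, assuming flat key/value configs (nested configs would need flattening first):

```python
def config_drift(env_a: dict, env_b: dict) -> dict:
    """Return {key: (value_in_a, value_in_b)} for every setting that differs."""
    keys = set(env_a) | set(env_b)
    return {k: (env_a.get(k), env_b.get(k))
            for k in sorted(keys) if env_a.get(k) != env_b.get(k)}

# Example with hypothetical settings:
staging = {"DB_POOL_SIZE": "20", "CACHE_TTL": "300"}
production = {"DB_POOL_SIZE": "100", "CACHE_TTL": "300"}
drift = config_drift(staging, production)
if drift:
    raise SystemExit(f"Configuration drift detected: {drift}")
```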
It is also helpful to make sure you're defining user requirements as thin vertical slices, instead of trying to test huge, cumbersome pieces of functionality. The ideal situation is being able to run tests only against components that have changed. Componentization of both your application and your testing procedures is critical, because you do not want to have to rebuild and re-test everything all the time. Remember, also, that if you make incremental changes to your code and frequently release them to your users (Agile, anyone?), you will get just enough feedback to improve quickly and reliably. Be cognizant of not biting off more than you can chew.
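A hedged sketch of that idea: map changed files to the test suites that cover them, using `git diff` to find what changed since the mainline. The `COMPONENT_TESTS` mapping is hypothetical—derive yours from your actual component boundaries:

```python
import subprocess

# Hypothetical mapping from component directories to their test suites.
COMPONENT_TESTS = {
    "billing/": "tests/billing",
    "search/": "tests/search",
    "auth/": "tests/auth",
}

def tests_for_changes(base: str = "origin/main") -> set[str]:
    """Return only the test suites covering files changed since `base`."""
    diff = subprocess.run(["git", "diff", "--name-only", base],
                          capture_output=True, text=True, check=True).stdout
    return {suite
            for path in diff.splitlines()
            for prefix, suite in COMPONENT_TESTS.items()
            if path.startswith(prefix)}
```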
Segregating your tests into several phases is a good way to accelerate your CI and test cycle—but make sure your application, builds and test batteries are all designed for that. With this approach, you are essentially running multiple phases of CI, starting with the fastest cycles that require the smallest build(s), with minimal dependencies, and moving on to the outer, cascading cycles of tests. Following a submit, run a minimal build, so that you do not need to wait—sometimes many hours—for an extensive set of all your tests to run with every single commit to see if it passes. Start with unit tests to minimize the build and provide fast feedback, then at various junctions incorporate additional tests, such as integration tests, installer tests, and usability and performance tests.
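A minimal sketch of that phased gating; the `pytest` commands are placeholders for your own build and test invocations:

```python
import subprocess
import sys

# Inner-to-outer CI phases: cheap, fast suites gate the slower ones, so a
# failing unit test never costs you an hours-long integration run.
PHASES = [
    ["pytest", "tests/unit", "-q"],         # fastest cycle, minimal build
    ["pytest", "tests/integration", "-q"],  # runs only if unit tests pass
    ["pytest", "tests/performance", "-q"],  # outermost, slowest cycle
]

for phase in PHASES:
    if subprocess.run(phase).returncode != 0:
        sys.exit(1)  # stop at the first failing phase; later phases never run
print("All phases green.")
```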
Another way to streamline your testing process is to take a look at the entire scope of your build and test cycle and go after the most inefficient stage. After automating your manual processes, look at the bottlenecks and what takes the most time and figure out how you can optimize that specific process. Often the answer is parallelization, but other times it is hardware. Iterate on your process and figure out where you can gain the most ground!
Lastly, parallelizing and distributing your builds and tests is a huge time-saver, so consider running tests across a cluster of CPUs to cut down the run time.
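For instance, here's a small sketch that fans independent suites out across CPU cores with the standard library; dedicated tools such as pytest-xdist (`pytest -n auto`) do the same thing at per-test granularity. The suite paths are assumptions, and the suites must not share state:

```python
import subprocess
from concurrent.futures import ProcessPoolExecutor

SUITES = ["tests/api", "tests/ui", "tests/db"]  # assumed independent suites

def run_suite(path: str) -> int:
    """Run one suite in its own process; return its exit code."""
    return subprocess.run(["pytest", path, "-q"]).returncode

if __name__ == "__main__":
    with ProcessPoolExecutor() as pool:
        exit_codes = list(pool.map(run_suite, SUITES))
    print("green" if not any(exit_codes) else "red")
```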
Where Do You Start If You Want to Make Sure All Your Builds Are Green?
Ask yourself: What are you trying to achieve? Is it a process where you can release code more quickly? Or one where you can address bugs more readily? Then make sure you're intimately aware of what you have in place. You need to know every step, task, process and tiny bit of your testing procedures and supporting infrastructure. Draw your entire end-to-end workflow on a whiteboard and find the bottlenecks that are slowing it down.
Once you've defined your objectives and are aware of your current state, you can put the relevant tools in place—from your development environment to your testing frameworks, all the way to post-development tools such as analytics—design the right testing pipelines, and hire people with the relevant skills. Don't insist on making all the mistakes yourself (tongue firmly in cheek); find folks who have already made the mistakes and learned from them—and hire them!
Also, give your team a great system that gives feedback fast. Do frequent code reviews so there is accountability and a constant learning process.
Consider the following best practices for improving your testing processes:
- Plan your testing approach ahead of time whenever possible. When you're designing your software, design how you're going to test it, too.
- The amount of testing you need to do depends on what industry you're in and your specific use case.
- Consider what level of test pass/fail you’re comfortable with, based on what you’ve done in the past.
- Consider pre-commit or pre-flight builds to make it more likely that your tests will pass once the code is checked in to mainline (see the sketch after this list).
- Define your requirements for each test cycle.
- Segregate your tests into "fast" and "slow," and decide which battery of tests you need to run every time and which processes you only need to run at certain junctions.
- If you hire well and bring new skills to your team, your process will get better and faster.
- Consider what problems your customers are facing and analyze them. Use whatever tools best let you test your software the way a customer would use it, to achieve the highest end-user satisfaction.
- Automate the manual steps you take for granted; this accelerates cycle times and improves productivity. Speed comes second to getting the process right.
- Iterate on the testing process, and not just on the code itself. You should always be looking for new ways to test.
- Test your test processes. Hold regular meetings with your team to reassess if your testing strategy is working for you.
- Enact code reviews so that there is accountability with the rest of the team.
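On the pre-commit point above, even a simple hook that runs your fast battery before a commit lands goes a long way toward keeping mainline green. A minimal sketch, assuming a fast `tests/unit` suite (save as `.git/hooks/pre-commit` and make it executable):

```python
#!/usr/bin/env python3
# Pre-commit hook sketch: run the fast test battery before the commit lands.
# A non-zero exit code blocks the commit; "tests/unit" is an assumed suite.
import subprocess
import sys

result = subprocess.run(["pytest", "tests/unit", "-q"])
if result.returncode != 0:
    print("Fast tests failed; commit aborted.")
sys.exit(result.returncode)
```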
As you map, design and optimize your tests, remember that your testing policies don’t need to be set in stone, and that you can—and should—adjust and tweak as you go.
Happy testing!
About the Author: Anders Wallgren
Anders Wallgren is chief technology officer at Electric Cloud. Anders has more than 25 years' experience designing and building commercial software. Prior to joining Electric Cloud, he held executive positions at Aceva, Archistra, and Impresse, and management positions at Macromedia (MACR), Common Ground Software, and Verity (VRTY), where he played critical technical leadership roles in delivering award-winning technologies such as Macromedia's Director 7. Anders holds a B.Sc. from MIT. He invites you to download the community edition of ElectricFlow to build your pipelines and deploy any application, to any environment, for free.