Whenever we’re pushed for time and forced to rush, we know we’re entering an unspoken deal with the devil. Whether it’s cooking the family a last-minute dinner, ironing a shirt before a wedding or writing against a strict deadline, we’re acutely aware it’s highly unlikely the output will be a fair reflection of our best efforts.
Seldom does “hurry” equal “success”; mistakes will inevitably happen, and managing them is all part of the age-old mission to keep speed and quality in equilibrium. But what happens when the work, as with enterprise software delivery, is perpetual by its very nature and the only goal is to deliver a continuously improving experience to discerning end users? How can that balance between speed and quality be struck in such a fast-paced environment?
It’s a challenge, that’s for sure, especially for software delivery teams at traditional large organizations. Unlike tech giants and other digital natives that were conceived from day one around business models driven by compelling digital experiences, mature enterprises (your banks, insurance companies, retailers and so on) are learning on the fly while serving and supporting vast product portfolios from across the business.
That doesn’t mean they can’t compete; far from it. They just need to rethink how product-related work flows across their value streams to better empower their talented test and quality assurance (QA) teams, who, with the right information available in real time, can ensure customers regularly receive high-quality products on schedule.
Trust in Your Test and QA Teams
In most instances, enterprises can dramatically enhance the quality and speed of their software delivery without pouring yet more resources into their test/QA organizations. Testing and QA teams can create order out of chaos; they are uniquely positioned to understand and assess the health of a product’s development and delivery, answering questions such as:
- Are all the requirements fulfilled?
- Are their implementations defect-free?
- Are the known defects tolerable if they’re deployed into production?
In many organizations, it’s up to the testing team to declare whether a product is ready to ship or whether they must call a stop-ship. To handle this responsibility effectively, the testing team must have access to the right information at all times. It was once hoped that test management tools could be the single source of truth (at least when it came to managing defects).
Such a dream turned out to be dead in the water because all other specialists across the software delivery value stream use their own best-of-breed tools to create and manage their work (represented as artifacts such as requirements, features, epics, stories and so on).
And because these specialized tools do not integrate naturally or flow information automatically, testers (and QA teams) end up on time-consuming scavenger hunts for the most current information, which dramatically undermines their ability to quickly and accurately determine the health of any product. It’s a vicious cycle.
There’s More to Product Development than Defects
Each specialist in the software delivery process has their own point of view on the shared artifacts. Testers need access to requirements to create tests. But the business analysts, product managers and product owners who write the requirements do not use the test management system, preferring their own purpose-built tools for developing requirements and user stories.
Similarly, project managers use project portfolio management (PPM) tools, while IT support uses IT service management (ITSM) tools, and so on for each discipline across the key stages of the value stream, from ideation and creation to release and operation. While each of these disciplines is linked to the others by the desire to create and deploy the highest-quality product possible, their ability to work together is hampered by the fact that their tools do not work together. And when these teams don’t work well together, it’s end users and the business that suffer.
Knowledge-Sharing Across the Value Stream
There are several ways to get these tools working together:
- A Manual Process: such as logging into another team’s tool, emailing a colleague for the information or consulting a spreadsheet.
- Batch Data Transfer: exporting defects and requirements from one system and bulk importing them into another. Of course, artifacts loaded this way have only static information; they are not continually updated with the most current status.
- Automated Integration Across the Toolchain: If these systems are integrated and the information in them is continuously synchronized, everyone has access to the most up-to-date information in real time. Practitioners can even collaborate with one another via this information, all while remaining in their tool of choice.
The third way is the only way to obtain the end-to-end visibility and traceability required for a single source of truth about the health of your software products. Only then can you access one consolidated body of data with which to analyze and optimize the quality of the software delivery process.
Connecting these tools and synchronizing all product-critical data across the value stream strengthens and streamlines the QA process. For instance, when a tester creates a defect in a QA tool, it’s automatically updated in the PPM tool and the agile planning tool. The same is true for user stories, requirements and test plans: a free-flowing, always up-to-date stream of information that makes or breaks the speed and quality of a product’s development.
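To make the idea concrete, here is a minimal sketch of what such a synchronization might look like under the hood. The endpoints, field names and `sync_defect` helper are all illustrative assumptions, not any particular vendor’s API; a production integration hub would add bidirectional flow, conflict resolution and retry handling.

```python
import requests  # widely used HTTP client; any equivalent would do

# Hypothetical mapping from the QA tool's defect fields to the
# planning tool's issue fields.
FIELD_MAP = {"title": "summary", "severity": "priority", "state": "status"}

def sync_defect(defect: dict) -> str:
    """Mirror a newly created defect into the planning tool."""
    payload = {FIELD_MAP[k]: v for k, v in defect.items() if k in FIELD_MAP}
    # Keep a back-reference so later updates route to the same issue.
    payload["external_id"] = defect["id"]
    resp = requests.post("https://plan.example.com/api/issues",
                         json=payload, timeout=10)
    resp.raise_for_status()
    return resp.json()["id"]  # store this id to sync future updates

# A webhook from the QA tool might deliver the defect as JSON:
issue_id = sync_defect({"id": "QA-314",
                        "title": "Login fails after password reset",
                        "severity": "high",
                        "state": "open"})
```

The key design point is the stored back-reference: it is what lets a status change on either side find, and update, its twin in the other tool.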
A Single Well-Oiled Machine for Quality Control
True value stream integration doesn’t just enhance alignment between testers, developers and business analysts; it empowers every specialist involved in planning, building and delivering a software product. Business analysts, product managers, product owners, developers, architects, testers, system administrators, project managers, scrum masters and help desk professionals all operate within a single unified system, sharing and collaborating on the same information.
With the test management system synchronized to other systems, the testing team gains a centralized view of a product’s health.
Testers
- Have access to product information in their test management system.
- Have access to the most up-to-date information, eliminating costly rework caused by stale data.
- Can collaborate on artifacts by updating and augmenting them in their tool, knowing their updates will be automatically reflected in all other systems syncing that artifact.
- Can better leverage the features in their tools when those features rely on artifacts created by other teams, such as automatically creating tests from requirements (a simple sketch follows this list).
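As a toy illustration of that last point, the sketch below assumes requirements synced into the test tool arrive as simple records carrying acceptance criteria; the field names and `tests_from_requirement` helper are hypothetical.

```python
def tests_from_requirement(req: dict) -> list[dict]:
    """Create one placeholder test case per acceptance criterion."""
    return [
        {"test_id": f"{req['id']}-T{i}",
         "name": criterion,
         "traces_to": req["id"],  # traceability back to the requirement
         "status": "not run"}
        for i, criterion in enumerate(req["acceptance_criteria"], start=1)
    ]

requirement = {"id": "REQ-42",
               "acceptance_criteria": ["User can reset a password",
                                       "Reset link expires after 24 hours"]}
print(tests_from_requirement(requirement))
```

Note that this kind of automation is only possible when requirements flow into the test management system in the first place; without integration, there is nothing to generate tests from.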
Testing Teams and Managers
- Gain better visibility into their products by receiving reports that pull data from multiple systems, rather than having a siloed view of their product data.
- Get more collaborative, efficient teams. Practitioners no longer rely on ineffective email threads to communicate; they collaborate on common artifacts in their tool of choice.
- Achieve a strong return on investment: each practitioner can save 20 minutes a day by automating manual processes, which can add up to millions a year in productivity savings and keeps precious human intellect from being wasted (see the rough math after this list).
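The math behind that claim is straightforward back-of-envelope arithmetic. The headcount, loaded hourly rate and working days below are illustrative assumptions, not figures from any study; substitute your own organization’s numbers.

```python
# Illustrative assumptions -- substitute your own organization's figures.
practitioners = 500          # people across the value stream
minutes_saved_per_day = 20   # per practitioner, from automating manual syncs
working_days = 230           # per year
hourly_rate = 75             # fully loaded cost per hour, USD

hours_saved = practitioners * minutes_saved_per_day / 60 * working_days
annual_savings = hours_saved * hourly_rate
print(f"{hours_saved:,.0f} hours saved ≈ ${annual_savings:,.0f} per year")
# 500 practitioners × 20 min/day ≈ 38,333 hours ≈ $2.9M a year
```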
Moreover, with a fully integrated value stream, QA and testing teams can institute standardized processes and tools without being undermined by the tools and processes of other parts of the organization. The centralized test management system can become the single pane of glass into a product’s development, giving test managers and executives insight into the chaos and, with it, more control and better decision-making. And, crucially, it ensures products are not only delivered on time but also continuously deliver delightful end-user experiences. No more deals with the devil, just the benefits of knowing the devil in the details.