At the DevOps World | Jenkins World 2019 conference, OverOps announced it has integrated its namesake tools for analyzing Java and Microsoft .NET code at runtime with a variety of continuous integration/continuous deployment (CI/CD) platforms, including Jenkins, JetBrains TeamCity, Atlassian Bamboo and Pivotal Concourse.
Eric Mizell, vice president of solution engineering for OverOps, noted that the more errors DevOps teams discover early in the development cycle, the less technical debt organizations must address after an application is deployed in a production environment. By pushing testing further to the left of the application development process, less troubleshooting will be required later on, he said.
OverOps employs predictive analytics and machine learning algorithms to identify new, increasing, resurfaced and critical errors in a release. It then generates a code quality report that ranks all severe issues likely to impact end users and application functions. By integrating OverOps within the CI/CD pipeline, DevOps teams can use those reports to determine whether code is safe to promote or automatically block unstable releases from moving forward.
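OverOps does not document its gating logic here, but the promote-or-block decision described above amounts to a quality gate the pipeline can evaluate after each build. The sketch below is a minimal, hypothetical version of such a step: the report schema, field names and severity classifications are illustrative assumptions, not OverOps' actual output format.

```python
import json
import sys

# Severity classes treated as release-blocking; mirrors the article's
# "new, increasing, resurfaced and critical" categories (assumed labels).
SEVERE_TYPES = {"new", "increasing", "resurfaced", "critical"}


def gate(report_path: str, max_severe: int = 0) -> bool:
    """Return True if the release is safe to promote.

    Reads a JSON code quality report (hypothetical schema) and counts
    errors whose classification is considered severe.
    """
    with open(report_path) as f:
        report = json.load(f)
    severe = [e for e in report.get("errors", [])
              if e.get("classification") in SEVERE_TYPES]
    return len(severe) <= max_severe


if __name__ == "__main__":
    # A nonzero exit code is how a CI/CD stage (Jenkins, TeamCity, etc.)
    # conventionally blocks an unstable release from moving forward.
    sys.exit(0 if gate(sys.argv[1]) else 1)
```

In a Jenkins or TeamCity job, a step like this would run immediately after the report is generated, so a failing gate stops promotion automatically rather than relying on someone reading the report.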
In addition, because the machine learning algorithms surface the true root cause of an issue, DevOps teams can remediate problems more efficiently by routing code back to the development team responsible for the issues being created downstream.
Making testing an integral part of the CI/CD process not only tightens the relationship between developers and testers, but also in many cases enables developers to test their own code before submitting it for review.
Mizell said the need to shift testing to the left has become more critical because customer tolerance for application errors is low, especially in use cases involving mobile applications—consumers are unlikely to give a mobile application a second chance after they’ve encountered an issue.
In an ideal world, of course, testing tools that make use of machine learning algorithms would be employed to coach developers in a way that enables them to develop reliable code more consistently and faster. However, some organizations will use tools that leverage machine learning algorithms to identify developers who are not meeting expectations. That weeding-out process is likely to begin much earlier in the application development life cycle as more advanced testing tools are embedded within CI/CD environments. Those tools won’t eliminate the need for developers and testers, but they do promise to dramatically increase the rate at which quality applications can be built and deployed at an industrial scale. Technical debt may never be completely eliminated, but at the very least the rate at which it accumulates should substantially decline.
It may be a while longer before machine learning algorithms and other forms of artificial intelligence (AI) are employed pervasively across DevOps processes. In most cases, those algorithms will help developers write better code faster. However, it’s also apparent there will be other algorithms at work that one way or another ensure developers will have fewer opportunities to make mistakes in the first place.