BMC this week introduced BMC AMI DevX Code Insights, a tool that uses machine learning algorithms to break monolithic COBOL mainframe applications into subprograms that are easier to update and refactor.
In addition, BMC is now making available BMC AMI zAdviser Enterprise, a commercial instance of a previously announced free tool that uses machine learning algorithms to collect telemetry data from mainframe applications. The commercial version of the tool adds a dashboard built and maintained by BMC that makes it simpler to consume that data.
John McKenny, senior vice president and general manager of Intelligent Z Optimization and Transformation at BMC, said BMC AMI DevX Code Insights visually maps a mainframe application to analyze data flow and debug, in real time, applications made up of millions of lines of code that would otherwise be too complex to unravel. Armed with those insights, IT teams can isolate business logic, refactor subprograms, add application programming interfaces (APIs) or remove dead code as they see fit, he added.
BMC AMI DevX Code Insights also makes it simpler for organizations to onboard new developers to a project at a time when many of the original developers of COBOL applications are retiring, said McKenny.
In the coming year, BMC is also committed to adding generative artificial intelligence (AI) tools to its portfolio for reengineering mainframe applications, added McKenny. Those tools, for example, will make it simpler to create documentation for legacy applications, he noted.
It’s not clear how quickly organizations that have mainframes are looking to reengineer applications, but as more of those organizations adopt DevOps best practices, their applications are being updated more frequently. The less monolithic the application, the simpler that task becomes. The issue is that mainframes tend to run large applications that access large amounts of data, so many IT teams are naturally fearful of breaking them if they are reengineered, noted McKenny.
While there is no doubt that the process of reengineering applications will become more automated in the years ahead, software engineers will still be needed to manage the overall process, said McKenny. The role of the software engineer will undoubtedly evolve, but it’s not likely software development and deployment will be wholly automated any time soon, he noted.
In the meantime, IT organizations that manage mainframes continue to embrace DevOps best practices to varying degrees. There may be some applications that they continue to build and deploy using waterfall methodologies, but as mainframe applications become integrated into digital business workflows, the need to update them faster becomes more apparent.
The challenge, as always, is not only providing IT teams with the tools to make that transition but also navigating the cultural issues that inevitably arise any time a change of this magnitude is made to the software development life cycle (SDLC). Regardless of the approach, the way mainframe applications are built and maintained is evolving. The only issue left to resolve is how quickly the friction that transition creates can be reduced.