The mainframe is alive and well. That big metal beast in the data center is not going anywhere anytime soon—nor are its applications.
“If mainframe applications disappeared tomorrow, planes couldn’t fly, people wouldn’t get paid, and the economy would come to a halt,” wrote Forrester analysts in a recent report, “Digital Transformation Needs Mainframe DevOps.”
I couldn’t agree more.
Mainframe systems are the brain of most enterprises. Unfortunately, many mainframe applications have failed to keep pace with technology innovation, leaving enterprises under pressure to modernize them at almost any cost.
Those “dinosaur” apps need to be modernized so they can become part of the mainstream IT ecosystem that revolves around the web, the cloud, and everything mobile. To make that leap into the modern software world, mainframe apps need to be re-engineered to fit into what is increasingly a cross-platform, predominantly digital world.
Digital enterprises need to apply the same continuous delivery management practices to mainframe environments that they use for their distributed, web, and cloud platforms.
Scripting Is Not the Solution
Complex, multi-tier enterprise applications often include back-end mainframe applications and data. Upgrades and enhancements of these applications, therefore, require modifications of application code, data interfaces and other components across multiple platforms simultaneously.
Many teams try to coordinate these cross-platform changes with custom scripts. Scripts, however, are virtually impossible to audit and typically depend on the knowledge of the individuals who write them. Furthermore, they lack the end-to-end process visibility and auditability vital to the success of continuous delivery.
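To make the contrast concrete, here is a minimal sketch, in Python with purely illustrative step names (not any particular vendor's API), of what a managed pipeline adds over an ad-hoc script: every step, on every platform, leaves a timestamped audit record that can be inspected after the fact.

```python
from datetime import datetime, timezone

class AuditedPipeline:
    """Toy deployment pipeline: every step is recorded, unlike an
    ad-hoc script that leaves no trail when it succeeds or fails."""

    def __init__(self):
        self.audit_log = []

    def run_step(self, name, platform, action):
        # Record who/what/when for each step, whether it succeeds or not.
        entry = {
            "step": name,
            "platform": platform,
            "started": datetime.now(timezone.utc).isoformat(),
        }
        try:
            action()
            entry["status"] = "succeeded"
        except Exception as exc:
            entry["status"] = f"failed: {exc}"
        self.audit_log.append(entry)
        return entry["status"]

# Hypothetical cross-platform release touching mainframe and cloud assets.
pipeline = AuditedPipeline()
pipeline.run_step("update COBOL copybooks", "z/OS", lambda: None)
pipeline.run_step("migrate DB2 schema", "z/OS", lambda: None)
pipeline.run_step("deploy REST gateway", "cloud", lambda: None)

for entry in pipeline.audit_log:
    print(f"[{entry['platform']}] {entry['step']}: {entry['status']}")
```

The point is not the code itself but the property it demonstrates: the audit trail is a by-product of running the pipeline, not an afterthought someone has to remember to script.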
The Need to Re-Engineer Mainframe Apps
DevOps, which revolves largely around adopting and improving automation, is increasingly being embraced by enterprises. Having initially applied DevOps to pilot projects, departmental needs and individual applications, big companies are now taking a broader perspective, focusing on the need to bring their mainframe applications into the modern world.
Most enterprises that embark on transforming their legacy applications realize it is smarter, easier and more cost-effective to build a bridge to modern engineering practices than to start from scratch. Typically, these companies adopt DevOps principles because research and analysis show they accelerate the delivery of top-quality software. At the same time, they recognize that their existing investments in mainframe applications cannot be thrown away or ignored.
Applying DevOps principles to the mainframe application delivery pipeline can help enterprises cut risk, cost and complexity, while improving their responsiveness to ever-changing customer needs.
Achieving unified deployment might mean working with two or three leading vendors to realize a vision of automated, cross-platform DevOps. Such a vision typically involves unifying COBOL applications, DB2 databases, z/OS processing power and other mainframe assets into a single, shared pool of enterprise digital resources that IT can deploy wherever and whenever the business requires, without operational or staffing impediments.
The Need for a Unifying Software Delivery Infrastructure
In enterprises, it is common to have a broad mix of back-end systems, middleware and various platforms from a vast array of vendors. These technologies might be managed by distributed teams, potentially in different parts of the world. When it comes to the process of creating and delivering software, the same level of distribution and diversity is also common.
To connect these islands of technology and expertise, enterprises need a unifying technology that sits on top of everything else, connects to everything, and helps to orchestrate, automate and provide visibility into the process of creating and delivering software. Improving traceability, auditability, security and risk compliance through orchestration and automation of the entire stack of applications is critical to any organization.
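As one way to picture the orchestration idea, the sketch below (standard-library Python, with hypothetical step names) models a cross-platform release as a dependency graph and computes a safe execution order. The unified view of which mainframe and distributed steps depend on which is precisely what gives the pipeline its end-to-end visibility.

```python
from graphlib import TopologicalSorter  # stdlib, Python 3.9+

# Hypothetical cross-platform release: each step names the platform it
# touches and the steps it must wait for. An orchestrator can derive a
# safe order so mainframe and distributed changes ship together.
steps = {
    "db2-schema-change":   {"platform": "z/OS",        "needs": []},
    "cobol-batch-update":  {"platform": "z/OS",        "needs": ["db2-schema-change"]},
    "api-gateway-release": {"platform": "cloud",       "needs": ["cobol-batch-update"]},
    "web-frontend-deploy": {"platform": "distributed", "needs": ["api-gateway-release"]},
}

# Topological sort yields an execution order that respects every dependency.
order = list(TopologicalSorter(
    {name: set(spec["needs"]) for name, spec in steps.items()}
).static_order())

for name in order:
    print(f"[{steps[name]['platform']}] {name}")
```

In a real tool the same graph would also drive parallelism (independent steps can run concurrently) and rollback ordering, but the dependency model is the core of cross-platform orchestration.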
What that means in practice is that people with different skills in different parts of the world can collaborate easily without having to learn new skills. For example, if you know all about the mainframe but know nothing about middleware or distributed systems, that’s no problem. You can still contribute to the process through a unified pipeline.
In the same vein, a unifying technology should reduce, if not eliminate, the gap between highly technical and non-technical contributors. That means it should let people make core changes to software without having to write, or even understand, code.
Modernizing the way organizations deploy applications to the mainframe is crucial to their success and very survival. Organizations need to embrace change at all levels of their IT infrastructure and begin automating their processes today. By applying DevOps processes and tools to their mainframes, these organizations will help their “dinosaur” applications evolve and survive under the pressures of modern software delivery processes and deadlines.
About the Author / Sunil Mavadia
Sunil Mavadia is director of Customer Success for XebiaLabs, provider of software for continuous delivery and DevOps. A former customer, he was the head of major DevOps transition projects with his previous company. Sunil runs Client Services, including consulting and implementation of the XebiaLabs product suite: XL Release, XL Deploy and XL TestView.