DevOps Practice

Why Legacy Application Development is a Challenge

Enterprises seeking to modernize their legacy mainframe applications most often cite one key reason: the applications cannot evolve quickly enough to support the changing needs of the business. This is also a driving force behind the increasing trend toward moving these legacy applications onto other platforms, either by attempting to rewrite or re-platform them.

If this were a simple process, however, the transition would have happened years ago. So, why is it so difficult to modernize mainframe applications? It is surely not the programming language. COBOL and other traditional languages are simply programming languages. They may not be considered “cool,” but neither are many of the other scripting languages, development languages or configuration syntaxes programmers deal with day in and day out.

So, what actually makes legacy application modernization challenging? In countless conversations with outsourcers, system integrators and mainframe customers, four major issues came up time and time again.

Development Tooling–Low Productivity on Mainframe

The traditional environment for a programmer working with COBOL or PL/1 on the mainframe has few of the productivity features today’s programmers expect: smart cross-references, on-the-fly syntax checking, autocomplete for variables, etc. Attempts have been made to bring the experience closer to that of a modern Java programmer, but there is no escaping the fact that a mainframe exists in the background with all its quirks. The blame for this divergence in programmer productivity unfairly tends to stick to the programming languages on the mainframe, with the result that language choice gets conflated with developer tooling in decisions about how to make mainframe development more agile.

Open-Source Availability–Limited Access to Innovation

Looking for solutions to business requirements among the many popular open-source projects is one of the fundamental reasons why development processes in general have become more agile. Clearly, the less code that must be written to satisfy a requirement, the faster that requirement will be met. Yet very few open-source projects have been implemented and tested on mainframes. Perhaps even more troubling is the difficulty of integrating a Linux-oriented solution with legacy applications in a natural way.

Pipeline Integration–Separate Pipeline and Processes for Mainframe Development

In a similar vein to developer tooling, modern development pipelines do not fit naturally with the way mainframe applications move from development into production. Build processes, testing, source-code management and promotion are integrated and streamlined within a modern development pipeline. Specifically, modern tools enable developers to promote locally tested changes and have those changes flow through an automated testing pipeline, with steadily increasing levels of integration with existing applications and data. The pipeline is facilitated by the tight integration of tools such as Git, Eclipse, Jenkins and containers to leverage the best practices of CI/CD, all working in harmony with an agile development organization to get application updates into production faster. The mainframe is not as efficient, mainly due to nascent support for many of these technologies, alongside the cost of running many of the automated processes repeatedly.
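As a minimal sketch of the kind of pipeline such tools enable, the fragment below uses GitHub Actions workflow syntax for a rehosted COBOL service. The container image, compile step and test script are hypothetical stand-ins, not a real product’s interface; the compile command assumes GnuCOBOL for illustration.

```yaml
# Hypothetical CI workflow for a rehosted COBOL application.
# Image name, compile command and test script are illustrative only.
name: legacy-app-ci
on: [push]

jobs:
  build-and-test:
    runs-on: ubuntu-latest
    container:
      image: example/rehost-runtime:latest  # stand-in for a software-defined mainframe image
    steps:
      - uses: actions/checkout@v4
      - name: Compile COBOL sources
        run: cobc -x src/*.cbl              # e.g., GnuCOBOL; a rehosting runtime's tooling may differ
      - name: Run unit tests
        run: ./run-unit-tests.sh            # shift-left tests triggered on every push
```

Every commit then triggers the same build-and-test loop a Java or C# team would expect, with the mainframe-specific pieces reduced to a container image reference.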

Testing–Limited Autonomy and Automation

Testing, as part of a modern development pipeline, is a highly automated and scalable process. The ability to launch containers with a single click is a fundamental reason why development is faster today than in the mainframe era.

Developers now expect to be able to launch a complete testing environment on their own workstation, without recourse to systems administrators, for the recurring small unit tests intertwined with development itself. Because mainframe development depends on physical mainframe hardware, this testing autonomy can be highly impractical for legacy applications.
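As a sketch of what that workstation autonomy looks like once an application is rehosted, the fragment below is a hypothetical Docker Compose file; the image and service names are assumptions for illustration, and the database simply stands in for migrated mainframe data.

```yaml
# Hypothetical developer-local test environment for a rehosted legacy application.
# All image and service names are illustrative.
services:
  legacy-app:
    image: example/rehost-runtime:latest   # software-defined mainframe runtime
    volumes:
      - ./src:/app/src                     # COBOL sources under test
    depends_on:
      - test-db
  test-db:
    image: postgres:16                     # stands in for migrated mainframe data
    environment:
      POSTGRES_PASSWORD: dev-only
```

A developer brings the whole environment up with `docker compose up -d`, and can scale it with `docker compose up --scale legacy-app=3` if parallel instances are needed; no systems administrator is involved.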

Furthermore, large-scale tests, which can be scheduled regularly, automatically and cost-effectively in modern, container-based cloud environments, require extensive budgeting and planning for legacy applications.

Consequently, all testing of legacy application development is prolonged by the need to involve mainframe systems administrators and to schedule expensive mainframe resources.

Solving the Issues

To address the issues above, a solution is required which breaks the software lock-in that ties a legacy application to the mainframe. It is possible to rewrite or re-platform existing applications, although this can be time-consuming, costly and risky. A more beneficial approach is to rehost the mainframe applications by migrating them to the cloud, without recompilation or conversion of data types. This method replaces the proprietary, mainframe hardware and software dependencies with a set of Linux libraries, requiring less time and fewer resources to execute.

Linux libraries can be created to recreate the behaviors of the operating system, databases, security and language runtime that previously tied the application to the mainframe hardware. In addition, the incompatibility of the mainframe instruction set with x86 hardware can be managed via a virtual machine.

If the mainframe is represented as a set of libraries, in the same way as any other conventional application dependency, then the entire problem of legacy application development is reduced. Any workstation or commodity server can fully virtualize the mainframe environment, running several instances in parallel if needed thanks to containers, and with total elasticity in the cloud.

Migrating to the cloud makes it possible for a developer to spin up a full test environment for legacy applications on their workstation, in the same way as they would for a Java or C# program. The entire development cycle for legacy applications can follow the same pipeline as for other applications. The same agile models, and CI/CD DevOps policies with strong emphasis on shift-left testing can be employed. The same automated scale testing is possible. The same exploitation of open-source projects can be considered.

Once rehosted, the word “legacy” can be dropped from the description of these applications. They simply become applications written in COBOL or PL/1. Everything else about their ongoing enhancement is identical to those applications written in modern languages.

The issues highlighted above, which businesses seeking agility face, exist only on the traditional mainframe. Migrating legacy applications to a software-defined mainframe, running on an x86 architecture, provides a modern, dynamic environment in which to develop, innovate, integrate and test applications.

Dale Vecchio

Dale Vecchio is Chief Marketing Officer at LzLabs. Dale spent 18 years as Research Vice President at Gartner, where he specialized in strategies for modernizing application portfolios. Before becoming one of the world’s foremost analysts in the realm of application modernization, Dale began his career as a mainframe application developer, moving on to become a systems programmer and later, a director and vice president of marketing across a range of technology companies, including Cincom Systems, Systems Center and Viasoft.
