

Why Legacy Application Development is a Challenge

By: Dale Vecchio on October 3, 2019

Enterprises seeking to modernize their legacy mainframe applications most often cite one key reason: the applications cannot evolve quickly enough to support the changing needs of the business. The same pressure is driving the growing trend toward moving these legacy applications onto other platforms, whether by attempting to rewrite them or by re-platforming them.


If this were a simple process, however, the transition would have happened years ago. So why is it so difficult to modernize mainframe applications? It is surely not the programming language. COBOL and other traditional languages are simply programming languages. They may not be considered “cool,” but neither are many of the other scripting languages, development languages and configuration syntaxes that programmers deal with day in and day out.


So, what actually makes legacy application modernization challenging? In countless conversations with outsourcers, system integrators and mainframe customers, four major issues came up time and time again.

Development Tooling–Low Productivity on the Mainframe

The traditional environment for a programmer working with COBOL or PL/1 on the mainframe has few of the productivity features today’s programmers expect: smart cross-references, on-the-fly syntax checking, autocomplete for variables and so on. Attempts have been made to bring the experience closer to that of a modern Java programmer, but there is no escaping the fact that a mainframe, with all its quirks, sits in the background. The blame for this productivity gap unfairly tends to stick to the programming languages themselves, with the result that language choice gets conflated with developer tooling in decisions about making mainframe development more agile.

Open-Source Availability–Limited Access to Innovation

Drawing on the many popular open-source projects to satisfy business requirements is one of the fundamental reasons development processes in general have become more agile: the less code that has to be written to meet a requirement, the faster that requirement is satisfied. Very few open-source projects, however, have been implemented and tested on mainframes. Perhaps even more troubling is the difficulty of integrating a Linux-oriented solution with legacy applications in any natural way.
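To make the “less code” point concrete, here is a minimal Java sketch, assuming a requirement as mundane as parsing a JSON payload; the class and field names are purely illustrative. On a Linux/JVM platform the open-source Jackson library turns this into a few lines and a well-tested dependency, an option rarely available and supported on a traditional mainframe:

// Minimal sketch: satisfying a requirement (parse a JSON order) with the
// open-source Jackson library. The Order type and its fields are illustrative.
import com.fasterxml.jackson.databind.ObjectMapper;

public class OrderParser {

    // Simple data holder for the payload; Jackson binds the public fields by name.
    public static class Order {
        public String id;
        public double amount;
    }

    public static Order parse(String json) throws Exception {
        ObjectMapper mapper = new ObjectMapper();
        return mapper.readValue(json, Order.class);
    }

    public static void main(String[] args) throws Exception {
        Order order = parse("{\"id\":\"A-100\",\"amount\":42.50}");
        System.out.println(order.id + " -> " + order.amount);
    }
}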

Pipeline Integration–Separate Pipeline and Processes for Mainframe Development

In a similar vein to developer tooling, modern development pipelines do not fit naturally with the way mainframe applications move from development into production. In a modern pipeline, build processes, testing, source-code management and promotion are integrated and streamlined. Specifically, modern tools let developers promote locally tested changes and have those changes flow through an automated testing pipeline, with steadily increasing levels of integration with existing applications and data. The pipeline is facilitated by the tight integration of tools such as Git, Eclipse, Jenkins and containers to leverage CI/CD best practices, all working in harmony with an agile development organization to get application updates into production faster. The mainframe is not as efficient, mainly due to nascent support for many of these technologies and the cost of running many of the automated processes repeatedly.
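The developer-side unit of such a pipeline is the small, fast test that runs locally and is then re-executed automatically by a CI server such as Jenkins on every commit. A minimal JUnit 5 sketch, in which the InterestCalculator class is hypothetical and stands in for any business rule that today lives in a COBOL or PL/1 program:

import org.junit.jupiter.api.Test;
import static org.junit.jupiter.api.Assertions.assertEquals;

// The kind of check a CI server re-runs automatically on every commit.
class InterestCalculatorTest {

    @Test
    void appliesTheStandardRate() {
        InterestCalculator calc = new InterestCalculator(0.05);
        // Fast and self-contained: no shared test region, no batch window.
        assertEquals(105.0, calc.apply(100.0), 0.0001);
    }
}

// Hypothetical class under test, standing in for a legacy business rule.
class InterestCalculator {
    private final double rate;

    InterestCalculator(double rate) {
        this.rate = rate;
    }

    double apply(double principal) {
        return principal * (1 + rate);
    }
}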

Testing–Limited Autonomy and Automation

Testing, as part of a modern development pipeline, is a highly automated and scalable process. The ability to launch containers with a single click is a fundamental reason development is faster today than it was in the mainframe era.

Developers now expect to be able to launch a complete testing environment on their own workstation, without recourse to systems administrators, for the small recurring unit tests intertwined with development itself. Because mainframe development depends on physical mainframe hardware, this testing autonomy is highly impractical for legacy applications.
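For comparison, this is what that workstation-level autonomy looks like on a modern stack: a minimal sketch, assuming JUnit 5 and the open-source Testcontainers library, that starts a throwaway database inside the test and discards it afterwards (the table, data and image tag are illustrative):

import org.junit.jupiter.api.Test;
import org.testcontainers.containers.PostgreSQLContainer;
import org.testcontainers.utility.DockerImageName;

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.Statement;

import static org.junit.jupiter.api.Assertions.assertTrue;

class CustomerRepositoryTest {

    @Test
    void queriesRunAgainstADisposableDatabase() throws Exception {
        // Start a throwaway PostgreSQL instance in a container: no shared test
        // region, no systems administrator, no scheduling request.
        try (PostgreSQLContainer<?> postgres =
                 new PostgreSQLContainer<>(DockerImageName.parse("postgres:14"))) {
            postgres.start();
            try (Connection conn = DriverManager.getConnection(
                     postgres.getJdbcUrl(), postgres.getUsername(), postgres.getPassword());
                 Statement stmt = conn.createStatement()) {
                stmt.execute("CREATE TABLE customer (id INT PRIMARY KEY, name VARCHAR(50))");
                stmt.execute("INSERT INTO customer VALUES (1, 'ACME')");
                assertTrue(stmt.executeQuery("SELECT name FROM customer WHERE id = 1").next());
            }
        } // The container is stopped and removed when the block exits.
    }
}

The same test runs unchanged on the developer’s laptop and in the CI pipeline, which is precisely the autonomy that a physical mainframe dependency takes away.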

Furthermore, large-scale tests, which can be regularly, automatically and cost-effectively scheduled in modern, container-based cloud environments, require extensive budgeting and planning for legacy applications.

Consequently, all testing of legacy application changes is prolonged by the need to involve mainframe systems administrators and to schedule expensive mainframe resources.

Solving the Issues

To address the issues above, a solution is required that breaks the software lock-in tying a legacy application to the mainframe. It is possible to rewrite or re-platform existing applications, although this can be time-consuming, costly and risky. A more beneficial approach is to rehost the mainframe applications by migrating them to the cloud, without recompilation or conversion of data types. This method replaces the proprietary mainframe hardware and software dependencies with a set of Linux libraries, and it takes less time and fewer resources to execute.

Linux libraries can be created to include features that recreate the behaviors of the operating system, databases, security and language runtime that previously tied the application to the mainframe hardware. In addition, the incompatibility of the mainframe instruction set with x86 hardware can be managed via a virtual machine.
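The shape of that approach can be sketched in Java. Everything below is purely illustrative and does not correspond to any particular rehosting product; the point is only that services the application used to obtain from the mainframe environment become an ordinary library interface with a Linux-hosted implementation behind it:

import java.nio.charset.StandardCharsets;
import java.util.HashMap;
import java.util.Map;

public class RehostingSketch {

    // Hypothetical interface: the environment services a legacy program expects.
    interface LegacyRuntimeServices {
        byte[] readRecord(String dataset, int recordNumber); // stands in for VSAM/QSAM access
        boolean checkAccess(String userId, String resource); // stands in for security calls
    }

    // A trivial Linux/JVM-hosted implementation backed by in-memory data; a real
    // one would sit on local files, an off-the-shelf database and a security store.
    static class LinuxRuntimeServices implements LegacyRuntimeServices {
        private final Map<String, String[]> datasets = new HashMap<>();

        LinuxRuntimeServices() {
            datasets.put("CUSTOMER.MASTER", new String[] {"0001 ACME CORP", "0002 GLOBEX"});
        }

        @Override
        public byte[] readRecord(String dataset, int recordNumber) {
            return datasets.get(dataset)[recordNumber].getBytes(StandardCharsets.UTF_8);
        }

        @Override
        public boolean checkAccess(String userId, String resource) {
            return true; // placeholder: always grant access in this sketch
        }
    }

    public static void main(String[] args) {
        LegacyRuntimeServices services = new LinuxRuntimeServices();
        System.out.println(new String(services.readRecord("CUSTOMER.MASTER", 0),
                StandardCharsets.UTF_8));
    }
}

The application code programs against the interface; swapping the mainframe-backed implementation for the Linux-backed one is what removes the hardware lock-in.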

If the mainframe is represented as a set of libraries, in the same way as any other conventional application dependency, then the entire problem of legacy application development shrinks. Any workstation or commodity server can fully virtualize the mainframe environment, running several instances in parallel if needed thanks to containers, and with total elasticity when running in the cloud.

Migrating to the cloud makes it possible for a developer to spin up a full test environment for legacy applications on their workstation, in the same way as they would for a Java or C# program. The entire development cycle for legacy applications can follow the same pipeline as for other applications. The same agile models and CI/CD DevOps policies, with a strong emphasis on shift-left testing, can be employed. The same automated scale testing is possible. The same exploitation of open-source projects can be considered.

Once these applications are rehosted, the word “legacy” can be dropped from their description. They are simply applications written in COBOL or PL/1, and everything else about their ongoing enhancement is identical to applications written in modern languages.

The issues highlighted here, which confront any business seeking agility, exist only on the traditional mainframe. Migrating legacy applications to a software-defined mainframe running on x86 architecture provides a modern, dynamic environment in which to develop, innovate, integrate and test applications.

— Dale Vecchio

Filed Under: Application Performance Management/Monitoring, Blogs, DevOps Practice, Enterprise DevOps Tagged With: CI/CD, COBOL, development tooling, legacy applications, legacy apps, legacy mainframe applications, mainframe
